CONTEXT: A modern semiconductor manufacturing process is kept under constant surveillance through signals/variables collected from sensors and process measurement points. However, not all of these signals are equally valuable in a given monitoring system: the measurements contain a mix of useful information, irrelevant information, and noise, and engineers typically collect far more signals than are actually required. If each type of signal is treated as a feature, feature selection can be applied to identify the most relevant signals. Process engineers can then use these signals to determine the key factors contributing to yield excursions downstream in the process, which increases process throughput, shortens time to learning, and reduces per-unit production cost. The signals serve as features to predict the yield type, and by analysing different combinations of features, the essential signals impacting the yield type can be identified.
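As a minimal sketch of the feature-selection idea described above — shown on synthetic data, since the real signals are only loaded later — scikit-learn's filter methods can first drop uninformative signals and then rank the rest by their association with a pass/fail label:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_classif

# Synthetic stand-in for sensor signals: 200 entities, 20 features,
# of which only 5 are actually informative for the pass/fail label.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=2, random_state=42)

# Step 1: drop near-constant signals (a flat noise floor carries no information).
X_var = VarianceThreshold(threshold=0.0).fit_transform(X)

# Step 2: keep the k features most associated with the label (ANOVA F-test).
selector = SelectKBest(score_func=f_classif, k=5).fit(X_var, y)
X_top = selector.transform(X_var)

print("original features:", X.shape[1], "-> selected:", X_top.shape[1])
```

This is a filter-style approach under assumed synthetic data; the same two-step pattern (variance screen, then univariate ranking) applies unchanged to the real sensor frame.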
DATA DESCRIPTION: signal-data.csv : (1567, 592)
The data consists of 1567 examples, each representing a single production entity with 591 associated measured features; the timestamp marks that specific test point. The labels represent a simple pass/fail yield from in-house line testing: a target value of -1 corresponds to a pass and 1 corresponds to a fail.
PROJECT OBJECTIVE: We will build a classifier to predict the pass/fail yield of a particular process entity and analyse whether all of the features are required to build the model.
Steps and tasks: [Total Score: 60 points]
1. Import and explore the data.
#IMPORTING THE REQUIRED LIBRARIES
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import plotly as py
import plotly.express as px
from scipy.stats import zscore
from scipy.stats import randint as sp_randint
from scipy.stats import uniform as sp_uniform
from sklearn.preprocessing import binarize
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split, StratifiedKFold, KFold, cross_val_score
from sklearn.model_selection import RandomizedSearchCV, GridSearchCV
from sklearn.metrics import confusion_matrix, classification_report, accuracy_score, f1_score
from sklearn.metrics import precision_recall_fscore_support
from sklearn import metrics
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from imblearn.over_sampling import SMOTE, ADASYN, RandomOverSampler
from imblearn.under_sampling import RandomUnderSampler

sns.set(color_codes=True)
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')
# Use the full option names; the short aliases 'max_columns'/'max_rows' are deprecated.
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
pdata = pd.read_csv("signal-data.csv")  # historical (past) dataset
print("The past dataset has", pdata.shape[0], "rows and", pdata.shape[1], "columns")
The past dataset has 1567 rows and 592 columns
pdata.head()
[Output truncated: `pdata.head()` displays the first 5 rows across all 592 columns — the Time stamp, sensor features 0–589, and the Pass/Fail target. Many sensor columns contain NaN values, and the Pass/Fail labels of the first five rows are -1, -1, 1, -1, -1.]
fdata = pd.read_excel("Future_predictions.xlsx")
print("The future dataset has", fdata.shape[0], "rows and", fdata.shape[1], "columns")
The future dataset has 18 rows and 591 columns
fdata.head()
| | Time | 0 | 1 | 2 | … | 589 |
|---|---|---|---|---|---|---|
| 0 | 2008-07-19 11:55:00 | 3030.93 | 2564.00 | 2187.7333 | … | NaN |
| 1 | 2008-07-19 12:32:00 | 3095.78 | 2465.14 | 2230.4222 | … | 208.2045 |
| 2 | 2008-07-19 13:17:00 | 2932.61 | 2559.94 | 2186.4111 | … | 82.8602 |
| 3 | 2008-07-19 14:43:00 | 2988.72 | 2479.90 | 2199.0333 | … | 73.8432 |
| 4 | 2008-07-19 15:22:00 | 3032.24 | 2502.87 | 2233.3667 | … | 73.8432 |

(5 rows × 591 columns — output truncated for readability.)
# Checking whether either dataset contains any null values
pdata.isnull().any().any(),fdata.isnull().any().any()
(True, True)
# Count the number of columns of each dtype in pdata
type_dct = {str(k): len(list(v)) for k, v in pdata.groupby(pdata.dtypes, axis=1)}
type_dct
{'int64': 1, 'float64': 590, 'object': 1}
type_dct = {str(k): len(list(v)) for k, v in fdata.groupby(fdata.dtypes, axis=1)}
type_dct
{'datetime64[ns]': 1, 'int64': 132, 'float64': 458}
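Note that pdata's Time column was read in as a plain object (string) dtype, while fdata's is already datetime64. Before any time-based analysis it may be worth parsing pdata's timestamps as well — a minimal sketch on a toy stand-in frame (column names assumed from the data above):

```python
import pandas as pd

# Toy stand-in: Time read as plain strings, as in pdata
df = pd.DataFrame({
    "Time": ["2008-07-19 11:55:00", "2008-07-19 12:32:00"],
    "0": [3030.93, 3095.78],
})

# Parse the timestamps so the column matches fdata's datetime64 dtype
df["Time"] = pd.to_datetime(df["Time"])

print(df["Time"].dtype)  # datetime64[ns]
```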
2. Data cleansing:
• Missing value treatment.
• Drop attribute/s if required using relevant functional knowledge.
• Make all relevant modifications on the data using both functional/logical reasoning/assumptions.
3. Data analysis & visualisation:
• Perform detailed relevant statistical analysis on the data.
• Perform a detailed univariate, bivariate and multivariate analysis with appropriate detailed comments after each analysis.
NOTE: Steps 2 and 3 are performed together in the following code, since the data is predominantly numeric in nature.
pdata.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 1567 entries, 0 to 1566
Columns: 592 entries, Time to Pass/Fail
dtypes: float64(590), int64(1), object(1)
memory usage: 7.1+ MB
np.count_nonzero(pdata.isna().sum())
538
So 538 columns contain missing values. Let's look at the columns with the highest missing-value counts:
pdata.isna().sum().sort_values(ascending=False).head(10)
158    1429
292    1429
293    1429
157    1429
85     1341
492    1341
220    1341
358    1341
517    1018
245    1018
dtype: int64
fdata.isna().sum().sort_values(ascending=False).head(10)
85     18
517    18
245    18
246    18
358    18
492    18
109    18
292    18
293    18
110    18
dtype: int64
# function to show null values with percentage
def null_values(pdata):
    nv = pd.concat([pdata.isnull().sum(),
                    100 * pdata.isnull().sum() / pdata.shape[0]],
                   axis=1).rename(columns={0: 'Missing_Records', 1: 'Percentage (%)'})
    return nv[nv.Missing_Records > 0].sort_values('Missing_Records', ascending=False)
df_na = null_values(pdata)
df_na.head(20).T
| 292 | 293 | 157 | 158 | 358 | 85 | 492 | 220 | 518 | 246 | 245 | 516 | 517 | 110 | 384 | 382 | 383 | 109 | 244 | 111 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Missing_Records | 1429.000000 | 1429.000000 | 1429.000000 | 1429.000000 | 1341.000000 | 1341.000000 | 1341.000000 | 1341.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 | 1018.000000 |
| Percentage (%) | 91.193363 | 91.193363 | 91.193363 | 91.193363 | 85.577537 | 85.577537 | 85.577537 | 85.577537 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 | 64.964901 |
import plotly.express as px

df = pdata.isna().sum()*100/pdata.shape[0]
fig = px.line(x=df.index, y=df, title="Percentage of missing values in all the features")
fig.update_xaxes(title_text='Features')
fig.update_yaxes(title_text='Percentage of missing values', range=[0,100])
fig.show()
df=pdata.isna().sum()*100/pdata.shape[0]
df = df[df>5].sort_values(ascending=False)
fig = px.bar(x=df.index,
y = df,
title='Percentage of missing values per feature (with >5% NaNs)',
text = round(df,1))
fig.update_xaxes(title_text='Features with more than 5% missing value (sorted)',type='category')
fig.update_yaxes(title_text='Percentage of missing values')
fig.show()
Remark: There is a big jump from 17.4% to 45.6%. Generally, features with more than 35% missing data do not offer much value in prediction.
df=(pdata == 0).sum()*100/pdata.shape[0]
fig = px.line(x=df.index, y=df,title="Percentage of zeros in all the features")
fig.update_xaxes(title_text= 'Features')
fig.update_yaxes(title_text= 'Percentage of zeros',range=[0,100])
fig.show()
A large number of zeros is present. Many features take only a single value, 0, throughout.
df = pd.cut(pdata.var().round(2),[-0.1,0,0.1,0.2,1,10,50,100,500,1000,float('inf')]).value_counts().sort_index()
df.index = df.index.map(str)
fig = px.bar(x=df.index, y=df,title="variance (rounded off to 2 decimal places) vs number of features", text = df)
fig.update_xaxes(title='variance intervals')
fig.update_yaxes(title='Number of features')
fig.show()
More than 250 features have extremely low variance (<0.1) and therefore contribute minimally to the output.
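The same low-variance filter can also be expressed with scikit-learn's `VarianceThreshold`; a sketch on toy data (the column names are illustrative, not from the dataset):

```python
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

# Toy frame: 'a' is constant (variance 0), 'b' varies
X = pd.DataFrame({"a": [0.0, 0.0, 0.0, 0.0],
                  "b": [1.0, 2.0, 3.0, 4.0]})

selector = VarianceThreshold(threshold=0.1)  # remove features with variance below 0.1
selector.fit(X)

kept = X.columns[selector.get_support()].tolist()
print(kept)  # only the varying column survives
```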
# Collect features with missing values more than 30%
df = pdata.isna().sum()*100/pdata.shape[0]
missing_features = df[df > 30].index.tolist()
# Collect features with variance less than or equal to 0.1
df = pdata.drop(columns='Pass/Fail').var().round(2)
low_var_features = df[df <= 0.1].index.tolist()
# Combine the lists and remove those columns from the main dataset
pdata2 = pdata.drop(columns=np.unique(low_var_features + missing_features).tolist())
print(f'There are {pdata2.shape[0]} rows and {pdata2.shape[1]} columns.')
print("The reduced data set has", len(pdata2.columns), "columns &", len(pdata.columns)-len(pdata2.columns), "were removed.")
print(f'Features left: {round(pdata2.shape[1]*100/pdata.shape[1],2)}%\n')
pdata2.head()
There are 1567 rows and 256 columns.
The reduced data set has 256 columns & 336 were removed.
Features left: 43.24%
| Time | 0 | 1 | 2 | 3 | 4 | 6 | 12 | 14 | 15 | 16 | 18 | 21 | 22 | 23 | 24 | 27 | 28 | 29 | 31 | 32 | 33 | 34 | 35 | 36 | 38 | 39 | 40 | 41 | 43 | 45 | 46 | 48 | 50 | 51 | 55 | 59 | 60 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 70 | 71 | 83 | 88 | 90 | 98 | 115 | 117 | 122 | 129 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 142 | 148 | 150 | 151 | 152 | 154 | 155 | 159 | 160 | 161 | 162 | 165 | 166 | 167 | 180 | 182 | 183 | 185 | 187 | 188 | 195 | 196 | 197 | 198 | 199 | 200 | 201 | 202 | 203 | 204 | 205 | 207 | 208 | 209 | 218 | 223 | 225 | 250 | 252 | 268 | 269 | 270 | 271 | 272 | 273 | 274 | 275 | 277 | 283 | 285 | 286 | 287 | 289 | 294 | 295 | 296 | 297 | 301 | 316 | 318 | 319 | 321 | 323 | 324 | 332 | 333 | 335 | 336 | 337 | 338 | 339 | 340 | 341 | 343 | 344 | 347 | 356 | 361 | 363 | 388 | 390 | 406 | 407 | 408 | 409 | 410 | 411 | 412 | 413 | 415 | 416 | 417 | 418 | 419 | 420 | 421 | 423 | 424 | 425 | 426 | 427 | 428 | 429 | 430 | 431 | 432 | 433 | 434 | 435 | 436 | 437 | 438 | 439 | 440 | 442 | 452 | 453 | 454 | 455 | 456 | 457 | 459 | 460 | 467 | 468 | 469 | 470 | 471 | 472 | 473 | 474 | 475 | 476 | 477 | 478 | 479 | 480 | 482 | 483 | 484 | 485 | 486 | 487 | 488 | 489 | 490 | 491 | 493 | 494 | 495 | 496 | 497 | 499 | 500 | 510 | 511 | 520 | 521 | 522 | 523 | 524 | 525 | 526 | 527 | 539 | 540 | 541 | 545 | 546 | 547 | 548 | 549 | 550 | 551 | 553 | 554 | 555 | 556 | 557 | 561 | 562 | 564 | 566 | 568 | 569 | 570 | 572 | 574 | 576 | 577 | 585 | 589 | Pass/Fail | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2008-07-19 11:55:00 | 3030.93 | 2564.00 | 2187.7333 | 1411.1265 | 1.3602 | 97.6133 | 202.4396 | 7.9558 | 414.8710 | 10.0433 | 192.3963 | -5419.00 | 2916.50 | -4043.75 | 751.00 | 3.0490 | 64.2333 | 2.0222 | 3.5191 | 83.3971 | 9.5126 | 50.6170 | 64.2588 | 49.3830 | 86.9555 | 117.5132 | 61.29 | 4.515 | 352.7173 | 130.3691 | 723.3092 | 141.2282 | 624.3145 | 218.3174 | 2834.0 | -1.7264 | 350.9264 | 108.6427 | 16.1445 | 21.7264 | 29.5367 | 693.7724 | 0.9226 | 148.6009 | 608.1700 | 84.0793 | 7.2163 | 1747.6049 | 8671.9301 | 0.3974 | 748.6115 | 58.4306 | 2.639 | -0.0473 | 1000.7263 | 39.2373 | 123.0 | 111.3 | 75.2 | 46.2000 | 350.6710 | 0.3948 | 6.78 | 12.2566 | 4.271 | 10.284 | 0.4734 | 11.8901 | 0.41 | 1017.0 | 967.0 | 1066.0 | 368.0 | 0.095 | 2.0 | 0.9 | 20.95 | 12.49 | 16.713 | 5.72 | 11.19 | 65.363 | 0.292 | 5.38 | 20.10 | 0.296 | 10.62 | 10.30 | 5.38 | 4.040 | 16.230 | 0.2951 | 8.64 | 10.30 | 97.314 | 0.0 | 3.4789 | 175.2173 | 1940.3994 | 219.9453 | 2.8374 | 40.855 | 4.5152 | 30.9815 | 33.9606 | 22.9057 | 15.9525 | 110.2144 | 0.1310 | 2.5883 | 3.9321 | 1.5123 | 3.5811 | 0.1337 | 3.8447 | 418.1363 | 398.3185 | 496.1582 | 158.3330 | 0.6083 | 6.2698 | 3.8208 | 5.3737 | 1.6252 | 3.2461 | 18.0118 | 1.5989 | 6.5893 | 3.0911 | 8.4654 | 1.5989 | 1.2293 | 5.3406 | 0.0867 | 2.8551 | 2.9971 | 31.8843 | 0.0 | 1.2708 | 55.2039 | 560.2658 | 61.5932 | 0.9967 | 14.2396 | 1.4392 | 5.6188 | 3.6721 | 2.9329 | 2.1118 | 24.8504 | 29.0271 | 6.9458 | 2.7380 | 5.9846 | 525.0965 | 0.0000 | 3.4641 | 6.0544 | 53.6840 | 2.4788 | 4.7141 | 1.7275 | 6.1800 | 3.2750 | 3.6084 | 18.7673 | 33.1562 | 26.3617 | 49.0013 | 10.0503 | 2.7073 | 3.1158 | 3.1136 | 44.5055 | 42.2737 | 1.3071 | 1.1975 | 5.9396 | 3.2698 | 9.5805 | 2.3106 | 6.1463 | 4.0502 | 1.7924 | 29.9394 | 6.2052 | 311.6377 | 5.7277 | 2.7864 | 9.7752 | 63.7987 | 24.7625 | 13.6778 | 2.3394 | 31.9893 | 5.8142 | 0.0 | 1.6936 | 115.7408 | 613.3069 | 291.4842 | 494.6996 | 178.1759 | 843.1138 | 0.0000 | 53.1098 | 0.0000 | 48.2091 | 0.7578 | 
2.9570 | 2.1739 | 10.0261 | 17.1202 | 22.3756 | 0.0000 | 0.0000 | 64.6707 | 0.0000 | 1.9864 | 0.0 | 29.3804 | 0.1094 | 4.8560 | 3.1406 | 0.5064 | 6.6926 | 2.0570 | 4.0825 | 11.5074 | 7.116 | 1.0616 | 395.570 | 75.752 | 0.4234 | 12.93 | 0.78 | 5.7349 | 0.3363 | 39.8842 | 3.2687 | 1.0297 | 42.3877 | NaN | NaN | NaN | NaN | NaN | 533.8500 | 8.95 | 3.0624 | 1.6765 | 14.9509 | 2.3630 | NaN | -1 |
| 1 | 2008-07-19 12:32:00 | 3095.78 | 2465.14 | 2230.4222 | 1463.6606 | 0.8294 | 102.3433 | 200.5470 | 10.1548 | 414.7347 | 9.2599 | 191.2872 | -5441.50 | 2604.25 | -3498.75 | -1640.25 | 7.3900 | 68.4222 | 2.2667 | 3.4171 | 84.9052 | 9.7997 | 50.6596 | 64.2828 | 49.3404 | 87.5241 | 118.1188 | 78.25 | 2.773 | 352.2445 | 133.1727 | 724.8264 | 145.8445 | 631.2618 | 205.1695 | 2853.0 | 0.8073 | 352.0073 | 113.9800 | 10.9036 | 19.1927 | 27.6301 | 697.1964 | 1.1598 | 154.3709 | 620.3582 | 82.3494 | 6.8043 | 1931.6464 | 8407.0299 | -0.9353 | 731.2517 | 58.6680 | 2.541 | -0.0946 | 998.1081 | 37.9213 | 98.0 | 80.3 | 81.0 | 56.2000 | 219.7679 | 0.2301 | 5.70 | 12.3319 | 6.285 | 13.077 | 0.5666 | 11.8428 | 0.35 | 568.0 | 59.0 | 297.0 | 3277.0 | 0.124 | 2.2 | 1.1 | 17.99 | 10.14 | 16.358 | 6.92 | 9.05 | 82.986 | 0.222 | 3.74 | 19.59 | 0.316 | 11.65 | 8.02 | 3.74 | 3.659 | 15.078 | 0.3580 | 8.96 | 8.02 | 134.250 | 0.0 | 3.9578 | 128.4285 | 1988.0000 | 193.0287 | 3.8999 | 29.743 | 3.6327 | 29.0598 | 28.9862 | 22.3163 | 17.4008 | 83.5542 | 0.0767 | 1.8459 | 3.9011 | 2.1016 | 3.9483 | 0.1662 | 3.7836 | 233.9865 | 26.5879 | 139.2082 | 1529.7622 | 0.8151 | 5.6522 | 2.9939 | 5.2445 | 1.8045 | 2.7661 | 23.6230 | 1.1506 | 5.9247 | 3.3604 | 7.7421 | 1.1506 | 1.1265 | 5.0108 | 0.1013 | 2.4278 | 2.4890 | 41.7080 | 0.0 | 1.2474 | 46.3453 | 677.1873 | 65.0999 | 1.1655 | 10.5837 | 1.0323 | 4.3465 | 2.5939 | 3.2858 | 2.5197 | 15.0150 | 27.7464 | 5.5695 | 3.9300 | 9.0604 | 0.0000 | 368.9713 | 2.1196 | 6.1491 | 61.8918 | 3.1531 | 6.1188 | 1.4857 | 6.1911 | 2.8088 | 3.1595 | 10.4383 | 2.2655 | 8.4887 | 199.7866 | 8.6336 | 5.7093 | 1.6779 | 3.2153 | 48.5294 | 37.5793 | 16.4174 | 1.9562 | 5.1072 | 4.3737 | 7.6142 | 2.2568 | 6.9233 | 4.7448 | 1.4336 | 40.4475 | 4.7415 | 463.2883 | 5.5652 | 3.0652 | 10.2211 | 73.5536 | 19.4865 | 13.2430 | 2.1627 | 30.8643 | 5.8042 | 0.0 | 1.2928 | 163.0249 | 0.0000 | 246.7762 | 0.0000 | 359.0444 | 130.6350 | 820.7900 | 194.4371 | 0.0000 | 58.1666 | 3.6822 | 3.2029 | 
0.1441 | 6.6487 | 12.6788 | 23.6469 | 0.0000 | 0.0000 | 141.4365 | 0.0000 | 1.6292 | 0.0 | 26.3970 | 0.0673 | 6.6475 | 3.1310 | 0.8832 | 8.8370 | 1.7910 | 2.9799 | 9.5796 | 7.116 | 1.3526 | 408.798 | 74.640 | 0.7193 | 16.00 | 1.33 | 7.1196 | 0.4989 | 53.1836 | 3.9139 | 1.7819 | 18.1087 | NaN | NaN | NaN | NaN | NaN | 535.0164 | 5.92 | 2.0111 | 1.1065 | 10.9003 | 4.4447 | 208.2045 | -1 |
| 2 | 2008-07-19 13:17:00 | 2932.61 | 2559.94 | 2186.4111 | 1698.0172 | 1.5102 | 95.4878 | 202.0179 | 9.5157 | 416.7075 | 9.3144 | 192.7035 | -5447.75 | 2701.75 | -4047.00 | -1916.50 | 7.5788 | 67.1333 | 2.3333 | 3.5986 | 84.7569 | 8.6590 | 50.1530 | 64.1114 | 49.8470 | 84.7327 | 118.6128 | 14.37 | 5.434 | 364.3782 | 131.8027 | 734.7924 | 141.0845 | 637.2655 | 185.7574 | 2936.0 | 23.8245 | 364.5364 | 115.6273 | 11.3019 | 16.1755 | 24.2829 | 710.5095 | 0.8694 | 145.8000 | 625.9636 | 84.7681 | 7.1041 | 1685.8514 | 9317.1698 | -0.1427 | 718.5777 | 58.4808 | 2.882 | -0.1892 | 998.4440 | 42.0579 | 89.0 | 126.4 | 96.5 | 45.1001 | 306.0380 | 0.3263 | 8.33 | 8.2660 | 4.819 | 8.443 | 0.4909 | 8.2054 | 0.47 | 562.0 | 788.0 | 759.0 | 2100.0 | 0.068 | 2.1 | 1.4 | 17.78 | 13.31 | 22.912 | 9.21 | 17.87 | 60.110 | 0.139 | 5.09 | 19.75 | 0.949 | 9.71 | 16.73 | 5.09 | 11.059 | 22.624 | 0.1164 | 13.30 | 16.73 | 79.618 | 0.0 | 2.4266 | 182.4956 | 839.6006 | 104.4042 | 4.1446 | 29.621 | 3.9133 | 23.5510 | 41.3837 | 32.6256 | 15.7716 | 97.3868 | 0.1117 | 2.5274 | 2.8705 | 1.5306 | 2.5493 | 0.1479 | 2.8046 | 251.4536 | 329.6406 | 325.0672 | 902.4576 | 0.6964 | 5.7247 | 3.8541 | 6.1797 | 2.5680 | 4.6067 | 16.0104 | 1.5481 | 5.9453 | 3.1600 | 8.9855 | 1.5481 | 2.9844 | 6.2277 | 0.0353 | 3.7663 | 5.6983 | 24.7959 | 0.0 | 0.7786 | 58.0575 | 283.6616 | 28.7334 | 1.2356 | 11.4871 | 1.1798 | 4.0782 | 4.3102 | 3.7696 | 2.0627 | 18.0233 | 21.6062 | 8.7236 | 3.0609 | 5.2231 | 0.0000 | 0.0000 | 2.2943 | 4.0917 | 50.6425 | 2.0261 | 5.2707 | 1.8268 | 4.2581 | 3.7479 | 3.5220 | 10.3162 | 29.1663 | 18.7546 | 109.5747 | 14.2503 | 5.7650 | 0.8972 | 3.1281 | 60.0000 | 70.9161 | 8.8647 | 0.4264 | 4.8795 | 7.5418 | 10.0984 | 3.1182 | 15.0790 | 6.5280 | 2.8042 | 32.3594 | 3.0301 | 21.3645 | 5.4178 | 9.3327 | 8.3977 | 148.0287 | 31.4674 | 45.5423 | 3.1842 | 13.3923 | 9.1221 | 0.0 | 2.6727 | 93.9245 | 434.2674 | 151.7665 | 0.0000 | 190.3869 | 746.9150 | 74.0741 | 191.7582 | 250.1742 | 34.1573 | 1.0281 | 
3.9238 | 1.5357 | 10.8251 | 18.9849 | 9.0113 | 0.0000 | 0.0000 | 240.7767 | 244.2748 | 2.9626 | 0.0 | 14.5293 | 0.0751 | 7.0870 | 12.1831 | 0.6451 | 6.4568 | 2.1538 | 2.9667 | 9.3046 | 7.116 | 0.7942 | 411.136 | 74.654 | 0.1832 | 16.16 | 0.85 | 7.1619 | 0.3752 | 23.0713 | 3.9306 | 1.1386 | 24.7524 | 267.064 | 1.10 | 0.4122 | 0.4119 | 68.8489 | 535.0245 | 11.21 | 4.0923 | 2.0952 | 9.2721 | 3.1745 | 82.8602 | 1 |
| 3 | 2008-07-19 14:43:00 | 2988.72 | 2479.90 | 2199.0333 | 909.7926 | 1.3204 | 104.2367 | 201.8482 | 9.6052 | 422.2894 | 9.6924 | 192.1557 | -5468.25 | 2648.25 | -4515.00 | -1657.25 | 7.3145 | 62.9333 | 2.6444 | 3.3813 | 84.9105 | 8.6789 | 50.5100 | 64.1125 | 49.4900 | 86.6867 | 117.0442 | 76.90 | 1.279 | 363.0273 | 131.8027 | 733.8778 | 142.5427 | 637.3727 | 189.9079 | 2936.0 | 24.3791 | 361.4582 | 116.1818 | 13.5597 | 15.6209 | 23.4736 | 710.4043 | 0.9761 | 147.6545 | 625.2945 | 70.2289 | 7.5925 | 1752.0968 | 8205.7000 | 0.0177 | 709.0867 | 58.6635 | 3.132 | 0.2838 | 980.4510 | 41.1025 | 127.0 | 118.0 | 123.7 | 47.8000 | 162.4320 | 0.1915 | 5.51 | 13.2651 | 9.073 | 15.241 | 1.3029 | 11.9738 | 0.35 | 859.0 | 355.0 | 3433.0 | 3004.0 | 0.100 | 1.7 | 0.9 | 16.22 | 14.67 | 22.562 | 5.69 | 18.20 | 52.571 | 0.139 | 5.92 | 23.60 | 1.264 | 10.63 | 13.56 | 5.92 | 11.382 | 24.320 | 0.3458 | 9.56 | 21.97 | 104.950 | 0.0 | 5.5398 | 152.0885 | 820.3999 | 94.0954 | 3.2119 | 31.830 | 3.1959 | 33.8960 | 37.8477 | 44.3906 | 16.9347 | 50.3631 | 0.0581 | 2.1775 | 4.2154 | 2.8960 | 4.0526 | 0.3882 | 3.9403 | 415.5048 | 157.0889 | 1572.6896 | 1377.4276 | 0.6305 | 5.4440 | 4.1900 | 6.3329 | 1.7339 | 4.9660 | 15.7375 | 1.7317 | 6.6262 | 3.2699 | 9.4020 | 1.7317 | 3.0672 | 6.6839 | 0.0928 | 3.0229 | 6.3292 | 29.0339 | 0.0 | 1.8222 | 45.7058 | 309.8492 | 32.4228 | 1.1135 | 13.3972 | 1.1907 | 5.6363 | 3.9482 | 4.9881 | 2.1737 | 17.8537 | 14.5054 | 5.2860 | 2.4643 | 7.6602 | 317.7362 | 0.0000 | 1.9689 | 6.5718 | 94.4594 | 3.6091 | 13.4420 | 1.5441 | 6.2313 | 2.8049 | 4.9898 | 15.7089 | 13.4051 | 76.0354 | 181.2641 | 5.1760 | 5.3899 | 1.3671 | 2.7013 | 34.0336 | 41.5236 | 7.1274 | 0.4097 | 4.4680 | 6.9785 | 11.1303 | 3.0744 | 13.7105 | 3.9918 | 2.8555 | 27.6824 | 3.0301 | 24.2831 | 6.5291 | 12.3786 | 9.1494 | 100.0021 | 37.8979 | 48.4887 | 3.4234 | 35.4323 | 6.4746 | 0.0 | 3.5135 | 149.4399 | 225.0169 | 100.4883 | 305.7500 | 88.5553 | 104.6660 | 71.7583 | 0.0000 | 336.7660 | 72.9635 | 
1.7670 | 3.1817 | 0.1488 | 8.6804 | 29.2542 | 9.9979 | 0.0000 | 711.6418 | 113.5593 | 0.0000 | 2.4416 | 0.0 | 13.2699 | 0.0977 | 5.4751 | 6.7553 | 0.7404 | 6.4865 | 2.1565 | 3.2465 | 7.7754 | 7.116 | 1.1650 | 372.822 | 72.442 | 1.8804 | 131.68 | 39.33 | 56.9303 | 17.4781 | 161.4081 | 35.3198 | 54.2917 | 62.7572 | 268.228 | 7.32 | 3.5611 | 2.7290 | 25.0363 | 530.5682 | 9.33 | 2.8971 | 1.7585 | 8.5831 | 2.0544 | 73.8432 | -1 |
| 4 | 2008-07-19 15:22:00 | 3032.24 | 2502.87 | 2233.3667 | 1326.5200 | 1.5334 | 100.3967 | 201.9424 | 10.5661 | 420.5925 | 10.3387 | 191.6037 | -5476.25 | 2635.25 | -3987.50 | 117.00 | 7.2748 | 62.8333 | 3.1556 | 3.2728 | 86.3269 | 8.7677 | 50.2480 | 64.1511 | 49.7520 | 86.1468 | 121.4364 | 76.39 | 2.209 | 353.3400 | 176.3136 | 789.7523 | 138.0882 | 667.7418 | 233.5491 | 2865.0 | -12.2945 | 355.0809 | 144.0191 | 21.9782 | 32.2945 | 44.1498 | 745.6025 | 0.9256 | 146.6636 | 645.7636 | 65.8417 | 7.5017 | 1828.3846 | 9014.4600 | -0.6704 | 796.5950 | 58.3858 | 3.148 | -0.5677 | 993.1274 | 38.1448 | 119.0 | 143.2 | 123.1 | 48.8000 | 296.3030 | 0.3744 | 3.64 | 14.2354 | 9.005 | 12.506 | 0.4434 | 13.9047 | 0.43 | 699.0 | 283.0 | 1747.0 | 1443.0 | 0.113 | 3.9 | 0.8 | 15.24 | 10.85 | 37.715 | 3.98 | 25.54 | 72.149 | 0.250 | 5.52 | 15.76 | 0.519 | 10.71 | 19.77 | 5.52 | 8.446 | 33.832 | 0.3951 | 9.09 | 19.77 | 92.307 | 0.0 | 4.1338 | 69.1510 | 1406.4004 | 149.2172 | 2.5775 | 19.862 | 3.6163 | 34.1250 | 55.9626 | 53.0876 | 17.4864 | 88.7672 | 0.1092 | 1.0929 | 4.4239 | 3.2376 | 3.6536 | 0.1293 | 4.3474 | 319.1252 | 128.0296 | 799.5884 | 628.3083 | 1.3500 | 4.8956 | 2.9130 | 11.0583 | 1.1229 | 7.3296 | 23.1160 | 1.6216 | 4.7279 | 3.1550 | 9.7777 | 1.6216 | 2.5923 | 10.5352 | 0.1301 | 3.0939 | 6.3767 | 32.0537 | 0.0 | 1.5530 | 21.0312 | 494.7368 | 57.2692 | 0.8495 | 7.1493 | 1.1704 | 5.3823 | 4.7226 | 4.9184 | 2.1850 | 22.3369 | 24.4142 | 3.6256 | 3.3208 | 4.2178 | 0.0000 | 866.0295 | 2.5046 | 7.0492 | 85.2255 | 2.9734 | 4.2892 | 1.2943 | 7.2570 | 3.4473 | 3.8754 | 12.7642 | 10.7390 | 43.8119 | 0.0000 | 11.4064 | 2.0088 | 1.5533 | 6.2069 | 25.3521 | 37.4691 | 15.2470 | 0.7198 | 4.3131 | 2.7092 | 6.1538 | 4.7756 | 11.4945 | 2.8822 | 3.8248 | 30.8924 | 5.3863 | 44.8980 | 4.4384 | 5.2987 | 7.4365 | 89.9529 | 17.0927 | 19.1303 | 4.5375 | 42.6838 | 6.1979 | 0.0 | 3.0615 | 140.1953 | 171.4486 | 276.8810 | 461.8619 | 240.1781 | 0.0000 | 587.3773 | 748.1781 | 0.0000 | 55.1057 | 2.2358 
| 3.2712 | 0.0372 | 3.7821 | 107.6905 | 15.6016 | 293.1396 | 0.0000 | 148.0663 | 0.0000 | 2.5512 | 0.0 | 18.7319 | 0.0616 | 4.4146 | 2.9954 | 2.2181 | 6.3745 | 2.0579 | 1.9999 | 9.4805 | 7.116 | 1.4636 | 399.914 | 79.156 | 1.0388 | 19.63 | 1.98 | 9.7608 | 0.8311 | 70.9706 | 4.9086 | 2.5014 | 22.0500 | NaN | NaN | NaN | NaN | NaN | 532.0155 | 8.83 | 3.1776 | 1.6597 | 10.9698 | 99.3032 | 73.8432 | -1 |
pdata2.isna().sum().sort_values(ascending=False).head(10)
564    273
562    273
568    273
569    273
566    273
548    260
550    260
557    260
556    260
555    260
dtype: int64
There are still many columns with null values; we will examine the data further before treating the remaining null values.
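Before modelling, the remaining gaps will have to be filled; one common choice for skewed sensor readings is median imputation, sketched here with scikit-learn's `SimpleImputer` on a toy frame (the column names are illustrative):

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Toy frame with NaNs standing in for the remaining gaps
df = pd.DataFrame({"s1": [1.0, np.nan, 3.0],
                   "s2": [10.0, 20.0, np.nan]})

# Replace each NaN with the column median
imputer = SimpleImputer(strategy="median")
filled = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)

print(filled.isna().sum().sum())  # no missing values remain
```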
# Remove the highly collinear features from the data
def remove_collinear_features(x, threshold):
    '''
    Objective:
        Remove collinear features in a dataframe with a correlation coefficient
        greater than the threshold. Removing collinear features can help a model
        generalize and improves the interpretability of the model.
    Inputs:
        x: features dataframe
        threshold: features with correlations greater than this value are removed
    Output:
        dataframe that contains only the non-highly-collinear features
    '''
    # Calculate the correlation matrix
    corr_matrix = x.corr()
    iters = range(len(corr_matrix.columns) - 1)
    drop_cols = []
    # Iterate through the correlation matrix and compare correlations
    for i in iters:
        for j in range(i + 1):
            item = corr_matrix.iloc[j:(j + 1), (i + 1):(i + 2)]
            col = item.columns
            row = item.index
            val = abs(item.values)
            # If the correlation exceeds the threshold
            if val >= threshold:
                # Print the correlated features and the correlation value
                print(col.values[0], "|", row.values[0], "|", round(val[0][0], 2))
                drop_cols.append(col.values[0])
    # Drop one of each pair of correlated columns
    drops = set(drop_cols)
    x = x.drop(columns=drops)
    return x
#Remove columns having more than 70% correlation
#Both positive and negative correlations are considered here
pdata3 = remove_collinear_features(pdata2,0.70)
22 | 21 | 0.73 34 | 32 | 0.75 35 | 34 | 0.77 36 | 32 | 0.75 36 | 34 | 1.0 36 | 35 | 0.77 39 | 34 | 0.8 39 | 36 | 0.8 46 | 45 | 0.81 50 | 46 | 0.9 60 | 43 | 0.9 60 | 50 | 0.7 65 | 64 | 0.84 66 | 46 | 0.82 66 | 50 | 0.76 70 | 46 | 0.8 70 | 50 | 0.77 70 | 62 | 0.73 70 | 66 | 0.9 140 | 4 | 1.0 148 | 16 | 0.97 152 | 16 | 0.98 152 | 148 | 0.99 154 | 16 | 0.87 154 | 148 | 0.94 154 | 152 | 0.89 165 | 159 | 0.79 187 | 185 | 0.83 196 | 67 | 0.86 197 | 67 | 0.86 197 | 196 | 0.9 198 | 196 | 0.7 198 | 197 | 0.72 199 | 67 | 0.81 199 | 196 | 0.94 199 | 197 | 0.83 199 | 198 | 0.71 202 | 201 | 0.8 203 | 196 | 0.81 203 | 197 | 0.71 203 | 199 | 0.8 203 | 200 | 0.76 203 | 202 | 0.84 204 | 67 | 0.9 204 | 196 | 0.87 204 | 197 | 0.82 204 | 199 | 0.83 204 | 203 | 0.8 205 | 67 | 0.87 205 | 196 | 0.86 205 | 197 | 0.84 205 | 198 | 0.71 205 | 199 | 0.8 205 | 204 | 0.83 207 | 67 | 0.86 207 | 196 | 0.92 207 | 197 | 0.87 207 | 199 | 0.88 207 | 200 | 0.71 207 | 203 | 0.86 207 | 204 | 0.87 207 | 205 | 0.87 209 | 202 | 0.83 252 | 117 | 0.99 270 | 135 | 0.95 271 | 136 | 0.97 272 | 137 | 0.98 272 | 271 | 0.71 273 | 138 | 0.92 274 | 139 | 0.99 275 | 4 | 1.0 275 | 140 | 1.0 277 | 4 | 0.72 277 | 140 | 0.72 277 | 142 | 0.97 277 | 275 | 0.72 283 | 16 | 0.97 283 | 148 | 1.0 283 | 152 | 0.99 283 | 154 | 0.94 285 | 150 | 0.97 286 | 151 | 0.99 287 | 16 | 0.98 287 | 148 | 0.99 287 | 152 | 1.0 287 | 154 | 0.89 287 | 283 | 0.99 289 | 16 | 0.88 289 | 148 | 0.94 289 | 152 | 0.89 289 | 154 | 0.99 289 | 283 | 0.94 289 | 287 | 0.89 294 | 159 | 0.99 294 | 165 | 0.82 295 | 160 | 1.0 296 | 161 | 0.99 297 | 162 | 0.99 301 | 166 | 0.96 316 | 180 | 0.88 318 | 182 | 0.98 319 | 183 | 0.98 321 | 185 | 0.99 321 | 187 | 0.83 323 | 185 | 0.82 323 | 187 | 0.99 323 | 321 | 0.82 324 | 188 | 0.98 332 | 67 | 0.88 332 | 196 | 0.96 332 | 197 | 0.91 332 | 199 | 0.9 332 | 203 | 0.72 332 | 204 | 0.82 332 | 205 | 0.89 332 | 207 | 0.91 333 | 67 | 0.87 333 | 196 | 0.87 333 | 197 | 0.98 333 | 198 | 0.74 333 | 199 | 0.8 333 | 204 | 0.79 333 | 
205 | 0.85 333 | 207 | 0.87 333 | 332 | 0.9 335 | 67 | 0.85 335 | 196 | 0.93 335 | 197 | 0.86 335 | 199 | 0.96 335 | 203 | 0.72 335 | 204 | 0.79 335 | 205 | 0.85 335 | 207 | 0.9 335 | 332 | 0.96 335 | 333 | 0.86 336 | 67 | 0.87 336 | 196 | 0.91 336 | 197 | 0.9 336 | 198 | 0.7 336 | 199 | 0.88 336 | 203 | 0.71 336 | 204 | 0.82 336 | 205 | 0.88 336 | 207 | 0.9 336 | 332 | 0.94 336 | 333 | 0.9 336 | 335 | 0.93 337 | 201 | 0.93 337 | 202 | 0.81 338 | 201 | 0.75 338 | 202 | 0.99 338 | 203 | 0.86 338 | 204 | 0.7 338 | 209 | 0.87 338 | 337 | 0.76 339 | 196 | 0.76 339 | 199 | 0.76 339 | 200 | 0.73 339 | 202 | 0.88 339 | 203 | 0.98 339 | 204 | 0.75 339 | 207 | 0.82 339 | 209 | 0.78 339 | 338 | 0.91 340 | 67 | 0.95 340 | 196 | 0.85 340 | 197 | 0.83 340 | 199 | 0.81 340 | 203 | 0.72 340 | 204 | 0.99 340 | 205 | 0.84 340 | 207 | 0.85 340 | 332 | 0.82 340 | 333 | 0.82 340 | 335 | 0.79 340 | 336 | 0.82 341 | 67 | 0.91 341 | 196 | 0.86 341 | 197 | 0.85 341 | 198 | 0.72 341 | 199 | 0.81 341 | 204 | 0.87 341 | 205 | 0.99 341 | 207 | 0.87 341 | 332 | 0.89 341 | 333 | 0.86 341 | 335 | 0.85 341 | 336 | 0.89 341 | 340 | 0.89 343 | 67 | 0.87 343 | 196 | 0.9 343 | 197 | 0.88 343 | 199 | 0.87 343 | 203 | 0.8 343 | 204 | 0.83 343 | 205 | 0.88 343 | 207 | 0.98 343 | 332 | 0.94 343 | 333 | 0.89 343 | 335 | 0.92 343 | 336 | 0.92 343 | 339 | 0.75 343 | 340 | 0.82 343 | 341 | 0.88 344 | 208 | 0.96 347 | 202 | 0.83 347 | 209 | 1.0 347 | 338 | 0.87 347 | 339 | 0.78 356 | 218 | 0.95 361 | 223 | 0.98 363 | 225 | 0.96 388 | 250 | 0.97 388 | 363 | 0.7 390 | 117 | 0.99 390 | 252 | 1.0 406 | 268 | 0.97 407 | 269 | 0.95 408 | 135 | 1.0 408 | 270 | 0.94 409 | 136 | 1.0 409 | 271 | 0.97 410 | 137 | 1.0 410 | 272 | 0.97 411 | 138 | 1.0 411 | 273 | 0.92 412 | 139 | 0.85 412 | 274 | 0.82 413 | 4 | 0.94 413 | 140 | 0.94 413 | 275 | 0.94 415 | 142 | 0.99 415 | 277 | 0.97 420 | 16 | 0.9 420 | 148 | 0.9 420 | 152 | 0.91 420 | 154 | 0.81 420 | 283 | 0.9 420 | 287 | 0.91 420 | 289 | 0.82 421 | 16 | 0.96 421 | 148 
| 1.0 421 | 152 | 0.98 421 | 154 | 0.95 421 | 283 | 1.0 421 | 287 | 0.98 421 | 289 | 0.95 421 | 420 | 0.9 424 | 151 | 0.98 424 | 286 | 0.97 425 | 16 | 0.94 425 | 148 | 0.96 425 | 152 | 0.98 425 | 154 | 0.86 425 | 283 | 0.96 425 | 287 | 0.97 425 | 289 | 0.86 425 | 420 | 0.88 425 | 421 | 0.95 427 | 16 | 0.89 427 | 148 | 0.95 427 | 152 | 0.91 427 | 154 | 1.0 427 | 283 | 0.95 427 | 287 | 0.91 427 | 289 | 0.99 427 | 420 | 0.83 427 | 421 | 0.97 427 | 425 | 0.88 428 | 155 | 1.0 430 | 159 | 0.87 430 | 165 | 0.85 430 | 294 | 0.89 431 | 160 | 0.81 431 | 165 | 0.81 431 | 294 | 0.72 431 | 295 | 0.83 431 | 430 | 0.9 434 | 159 | 0.71 434 | 165 | 0.86 434 | 294 | 0.75 434 | 430 | 0.95 434 | 431 | 0.93 435 | 159 | 0.71 435 | 165 | 0.87 435 | 294 | 0.75 435 | 430 | 0.95 435 | 431 | 0.93 435 | 434 | 0.99 436 | 159 | 0.71 436 | 165 | 0.88 436 | 294 | 0.75 436 | 430 | 0.95 436 | 431 | 0.93 436 | 434 | 0.99 436 | 435 | 1.0 437 | 166 | 0.99 437 | 301 | 0.95 440 | 27 | 0.71 452 | 180 | 0.99 452 | 316 | 0.86 454 | 182 | 0.99 454 | 318 | 0.97 455 | 183 | 1.0 455 | 319 | 0.98 456 | 185 | 0.71 456 | 321 | 0.72 457 | 185 | 1.0 457 | 187 | 0.81 457 | 321 | 0.99 457 | 323 | 0.8 457 | 456 | 0.71 459 | 185 | 0.82 459 | 187 | 1.0 459 | 321 | 0.82 459 | 323 | 0.99 459 | 457 | 0.81 467 | 195 | 1.0 469 | 67 | 0.87 469 | 196 | 0.9 469 | 197 | 1.0 469 | 198 | 0.72 469 | 199 | 0.83 469 | 203 | 0.71 469 | 204 | 0.81 469 | 205 | 0.85 469 | 207 | 0.88 469 | 332 | 0.92 469 | 333 | 0.99 469 | 335 | 0.88 469 | 336 | 0.91 469 | 340 | 0.82 469 | 341 | 0.86 469 | 343 | 0.89 470 | 198 | 1.0 470 | 333 | 0.71 471 | 196 | 0.83 471 | 199 | 0.94 471 | 202 | 0.73 471 | 203 | 0.8 471 | 204 | 0.74 471 | 207 | 0.74 471 | 332 | 0.72 471 | 335 | 0.83 471 | 336 | 0.72 471 | 338 | 0.72 471 | 339 | 0.79 473 | 201 | 0.87 473 | 337 | 0.76 474 | 201 | 0.74 474 | 202 | 0.71 474 | 337 | 0.71 474 | 473 | 0.79 475 | 196 | 0.79 475 | 199 | 0.77 475 | 200 | 0.76 475 | 202 | 0.87 475 | 203 | 1.0 475 | 204 | 0.78 475 | 207 | 0.83 475 | 
209 | 0.73 475 | 338 | 0.88 475 | 339 | 0.99 475 | 343 | 0.76 475 | 347 | 0.73 475 | 471 | 0.8 477 | 67 | 0.92 477 | 196 | 0.88 477 | 197 | 0.85 477 | 199 | 0.83 477 | 204 | 0.87 477 | 205 | 0.99 477 | 207 | 0.88 477 | 332 | 0.91 477 | 333 | 0.85 477 | 335 | 0.87 477 | 336 | 0.89 477 | 340 | 0.88 477 | 341 | 0.99 477 | 343 | 0.89 477 | 469 | 0.85 478 | 202 | 0.83 478 | 209 | 1.0 478 | 338 | 0.87 478 | 339 | 0.78 478 | 347 | 1.0 478 | 475 | 0.73 479 | 67 | 0.85 479 | 196 | 0.91 479 | 197 | 0.87 479 | 199 | 0.88 479 | 200 | 0.75 479 | 203 | 0.88 479 | 204 | 0.88 479 | 205 | 0.86 479 | 207 | 1.0 479 | 332 | 0.89 479 | 333 | 0.86 479 | 335 | 0.88 479 | 336 | 0.89 479 | 338 | 0.71 479 | 339 | 0.84 479 | 340 | 0.85 479 | 341 | 0.86 479 | 343 | 0.97 479 | 469 | 0.87 479 | 471 | 0.76 479 | 475 | 0.85 479 | 477 | 0.87 480 | 208 | 0.8 480 | 344 | 0.78 490 | 218 | 0.98 490 | 356 | 0.93 495 | 223 | 1.0 495 | 361 | 0.97 497 | 225 | 0.99 497 | 363 | 0.96 522 | 225 | 0.71 522 | 250 | 0.99 522 | 388 | 0.96 522 | 497 | 0.72 524 | 117 | 0.98 524 | 252 | 1.0 524 | 390 | 1.0 540 | 268 | 1.0 540 | 406 | 0.97 541 | 269 | 0.97 541 | 407 | 0.91 549 | 546 | 0.73 551 | 550 | 0.72 553 | 550 | 0.98 554 | 550 | 0.73 554 | 551 | 1.0 554 | 553 | 0.7 555 | 549 | 0.88 556 | 550 | 1.0 556 | 551 | 0.74 556 | 553 | 0.98 556 | 554 | 0.75 557 | 550 | 0.72 557 | 551 | 1.0 557 | 554 | 1.0 557 | 556 | 0.75 566 | 564 | 0.98 568 | 564 | 1.0 568 | 566 | 0.98 574 | 572 | 0.99 576 | 572 | 0.99 576 | 574 | 0.99 577 | 572 | 0.86 577 | 574 | 0.85 577 | 576 | 0.86
pdata3.shape
(1567, 130)
Out of 591 features, 130 now remain, which significantly reduces the dimensionality for model building.
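The pairwise loop above is O(p²) in pure Python; the same threshold-based drop can be sketched in vectorised form with an upper-triangular mask (it flags the same pairs, though which column of a pair is kept may differ):

```python
import numpy as np
import pandas as pd

def drop_correlated(df, threshold=0.70):
    """Drop one column from every pair whose |correlation| exceeds threshold."""
    corr = df.corr().abs()
    # Keep only the upper triangle so each pair is inspected once
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [c for c in upper.columns if (upper[c] > threshold).any()]
    return df.drop(columns=to_drop)

# Toy demo: 'y' duplicates 'x' exactly, 'z' is weakly related noise
demo = pd.DataFrame({"x": [1.0, 2.0, 3.0, 4.0],
                     "y": [2.0, 4.0, 6.0, 8.0],
                     "z": [4.0, 1.0, 3.0, 2.0]})
print(drop_correlated(demo).columns.tolist())  # 'y' is dropped
```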
#Checking the number of zeroes and NA in the data.
import plotly.graph_objs as go
from plotly.offline import init_notebook_mode, iplot, plot
df=((pdata3 == 0).sum() + pdata3.isna().sum())*100/pdata3.shape[0]
fig = go.Figure()
fig.add_trace(go.Scatter(x=df.index, y=df,mode='lines'))#,name='markers'
fig.layout = dict(title = 'Percentage of zeros + NA in all the features',
xaxis= dict(title= 'Features'),
yaxis= dict(title= 'Percentage of zeros + NA',range=[0,100]))
iplot(fig)
# Check if some column has dominant values
df = pdata3.drop(columns='Pass/Fail').nunique()
Drop = df[df <= 20]
Drop
521    9
dtype: int64
As we can see from the graph, column 521 is dominated by zero values.
# Checking if any other feature is dominated by a value other than zero
df = pdata3.apply(pd.value_counts).max()*100/pdata3.shape[0]
fig = go.Figure()
fig.add_trace(go.Scatter(x=df.index, y=df,mode='lines'))#,name='markers'
fig.layout = dict(title = 'Frequency of most frequent element (percentage) in all the features (data: after 3rd modification)',
xaxis= dict(title= 'Features'),
yaxis= dict(title= 'frequency in percentage',range=[0,100]))
iplot(fig)
Evidently no other feature is heavily dominated by a single value.
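An equivalent, more explicit check computes the share of the single most frequent value per column; a sketch on a hypothetical two-column frame:

```python
import pandas as pd

df = pd.DataFrame({"a": [0, 0, 0, 0, 1],   # 80% zeros -> dominated
                   "b": [1, 2, 3, 4, 5]})  # all values distinct

# Share of the most frequent value in each column
dominance = df.apply(lambda s: s.value_counts(normalize=True).iloc[0])
dominated = dominance[dominance > 0.5].index.tolist()
print(dominated)  # only 'a' crosses the 50% mark
```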
pdata3.describe()
| 0 | 1 | 2 | 3 | 4 | 6 | 12 | 14 | 15 | 16 | 18 | 21 | 23 | 24 | 27 | 28 | 29 | 31 | 32 | 33 | 38 | 40 | 41 | 43 | 45 | 48 | 51 | 55 | 59 | 62 | 63 | 64 | 67 | 68 | 71 | 83 | 88 | 90 | 98 | 115 | 117 | 122 | 129 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 142 | 150 | 151 | 155 | 159 | 160 | 161 | 162 | 166 | 167 | 180 | 182 | 183 | 185 | 188 | 195 | 200 | 201 | 208 | 218 | 223 | 225 | 250 | 268 | 269 | 416 | 417 | 418 | 419 | 423 | 426 | 429 | 432 | 433 | 438 | 439 | 442 | 453 | 460 | 468 | 472 | 476 | 482 | 483 | 484 | 485 | 486 | 487 | 488 | 489 | 491 | 493 | 494 | 496 | 499 | 500 | 510 | 511 | 520 | 521 | 523 | 525 | 526 | 527 | 539 | 545 | 546 | 547 | 548 | 550 | 561 | 562 | 564 | 569 | 570 | 572 | 585 | 589 | Pass/Fail | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 1561.000000 | 1560.000000 | 1553.000000 | 1553.000000 | 1553.000000 | 1553.000000 | 1565.000000 | 1564.000000 | 1564.000000 | 1564.000000 | 1564.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1543.000000 | 1543.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1563.000000 | 1560.000000 | 1561.000000 | 1560.000000 | 1560.000000 | 1561.000000 | 1561.000000 | 1561.000000 | 1566.000000 | 1567.000000 | 1516.000000 | 1561.000000 | 1567.000000 | 1567.000000 | 1558.00000 | 1558.000000 | 1559.000000 | 1559.000000 | 1562.000000 | 1561.000000 | 1560.000000 | 1553.000000 | 1553.000000 | 1553.000000 | 1564.000000 | 1564.000000 | 1557.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1563.000000 | 1560.000000 | 1560.000000 | 1561.000000 | 1566.000000 | 1567.000000 | 1516.000000 | 1567.000000 | 1559.000000 | 1559.000000 | 1558.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1564.000000 | 1564.000000 | 1567.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1566.000000 | 1566.000000 | 1566.000000 | 1560.000000 | 1560.000000 | 1561.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1543.000000 | 1555.000000 | 1567.000000 | 1567.000000 | 1516.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1565.000000 | 1567.000000 | 1567.000000 | 1567.000000 | 1543.000000 | 1567.000000 | 1567.000000 | 1559.000000 | 1565.000000 | 1307.000000 | 1307.000000 | 1307.000000 | 1307.000000 | 1566.000000 | 1294.000000 | 1294.000000 | 1294.000000 | 1567.000000 | 1567.000000 | 1566.000000 | 1566.000000 | 1567.000000 |
| mean | 3014.452896 | 2495.850231 | 2200.547318 | 1396.376627 | 4.197013 | 101.112908 | 199.956809 | 9.005371 | 413.086035 | 9.907603 | 190.047354 | -5618.393610 | -3806.299734 | -298.598136 | 6.638628 | 69.499532 | 2.366197 | 3.673189 | 85.337469 | 8.960279 | 86.836577 | 67.904909 | 3.353066 | 355.538904 | 136.743060 | 139.972231 | 157.420991 | 2856.172105 | 2.960241 | 116.502329 | 13.989927 | 20.542109 | 16.715444 | 147.437578 | 104.329033 | 7.452067 | 1807.815021 | 8827.536865 | -0.018143 | 747.383792 | 58.625908 | 3.89839 | -0.554228 | 1004.043093 | 39.391979 | 117.960948 | 138.194747 | 122.692949 | 57.603025 | 416.766964 | 6.641565 | 6.814268 | 14.047403 | 0.507171 | 882.680511 | 555.346326 | 4066.850479 | 4797.154633 | 2.788882 | 1.235783 | 19.013257 | 10.780543 | 26.661170 | 7.365741 | 43.211418 | 0.287084 | 17.600192 | 7.839359 | 73.264316 | 3.771465 | 122.846571 | 1041.056588 | 109.650967 | 19.504677 | 3.777866 | 3.404349 | 8.190905 | 320.259235 | 309.061299 | 77.660446 | 1.233858 | 4.171844 | 99.367633 | 205.519304 | 54.701052 | 70.643942 | 1.345259 | 5.460971 | 29.197414 | 224.173047 | 137.888406 | 20.132155 | 318.418448 | 206.564196 | 215.288948 | 201.111728 | 302.506186 | 239.455326 | 352.616477 | 272.169707 | 2.442673 | 2.530046 | 0.956442 | 29.865896 | 263.195864 | 240.981377 | 55.763508 | 275.979457 | 2.695999 | 11.610080 | 0.453896 | 5.560397 | 1.443457 | 6.395717 | 3.034235 | 7.611403 | 1.039630 | 403.546477 | 75.679871 | 17.013313 | 32.284956 | 262.729683 | 6.444985 | 21.117674 | 530.523623 | 28.450165 | 3.067826 | 99.670066 | -0.867262 |
| std | 73.621787 | 80.407705 | 29.513152 | 441.691640 | 56.355540 | 6.237214 | 3.257276 | 2.796596 | 17.221095 | 2.403867 | 2.781041 | 626.822178 | 1380.162148 | 2902.690117 | 1.244249 | 3.461181 | 0.408694 | 0.535322 | 2.026549 | 1.344456 | 0.446756 | 24.062943 | 2.360425 | 6.234706 | 7.849247 | 4.524251 | 60.925108 | 25.749317 | 9.532220 | 8.629022 | 7.119863 | 4.977467 | 307.502293 | 4.240095 | 31.651899 | 0.516251 | 53.537262 | 396.313662 | 0.427110 | 48.949250 | 6.485174 | 0.90412 | 1.220479 | 6.537701 | 2.990476 | 57.544627 | 53.909792 | 52.253015 | 12.345358 | 263.300614 | 3.552254 | 3.241843 | 31.002541 | 1.122427 | 983.043021 | 574.808588 | 4239.245058 | 6553.569317 | 1.119756 | 0.632767 | 3.311632 | 4.164051 | 6.836101 | 7.188720 | 21.711876 | 0.395187 | 8.690718 | 5.104495 | 28.067143 | 1.170436 | 55.156003 | 433.170076 | 54.597274 | 7.344404 | 1.152329 | 1.035433 | 4.054515 | 287.704482 | 325.448391 | 32.596933 | 0.995620 | 6.435390 | 126.188715 | 225.778870 | 34.108051 | 38.376178 | 0.659195 | 2.250804 | 13.335189 | 230.766915 | 47.698041 | 14.939590 | 281.011323 | 192.864413 | 213.126638 | 218.690015 | 287.364070 | 263.837645 | 252.043751 | 228.046702 | 1.224283 | 0.973948 | 6.615200 | 24.621586 | 324.771342 | 323.003410 | 37.691736 | 329.664680 | 5.702366 | 103.122996 | 4.147581 | 3.920370 | 0.958428 | 1.888698 | 1.252913 | 1.315544 | 0.389066 | 5.063887 | 3.390523 | 4.966954 | 19.026081 | 7.630585 | 2.633583 | 10.213294 | 17.499736 | 86.304681 | 3.578033 | 93.891919 | 0.498010 |
| min | 2743.240000 | 2158.750000 | 2060.660000 | 0.000000 | 0.681500 | 82.131100 | 182.094000 | 2.249300 | 333.448600 | 4.469600 | 169.177400 | -7150.250000 | -9986.750000 | -14804.500000 | 0.000000 | 59.400000 | 0.666700 | 2.069800 | 83.182900 | 7.603200 | 84.732700 | 1.434000 | -0.075900 | 342.754500 | 108.846400 | 125.798200 | 40.261400 | 2801.000000 | -28.988200 | 81.490000 | 1.659100 | 6.448200 | 0.413700 | 87.025500 | 21.433200 | 5.825700 | 1627.471400 | 7397.310000 | -5.271700 | 544.025400 | 52.806800 | 1.67100 | -3.779000 | 980.451000 | 33.365800 | 58.000000 | 36.100000 | 19.200000 | 19.800000 | 0.000000 | 1.740000 | 1.337000 | 2.020000 | 0.140000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.800000 | 0.300000 | 9.400000 | 3.170000 | 5.014000 | 1.940000 | 6.613000 | 0.080000 | 3.210000 | 0.000000 | 5.359000 | 1.034000 | 32.263700 | 168.799800 | 21.010700 | 6.098000 | 1.301700 | 0.000000 | 2.153100 | 0.000000 | 0.000000 | 23.020000 | 0.363200 | 0.783700 | 0.000000 | 0.000000 | 0.000000 | 14.120600 | 0.097400 | 0.903700 | 7.953400 | 0.000000 | 11.499700 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.555800 | 0.833000 | 0.034200 | 4.813500 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.312100 | 0.000000 | 0.025800 | 1.540000 | 0.170500 | 2.170000 | 0.851600 | 4.429400 | 0.444400 | 372.822000 | 71.038000 | 6.110000 | 7.236900 | 242.286000 | 0.970000 | 3.250400 | 317.196400 | 3.540000 | 1.197500 | 0.000000 | -1.000000 |
| 25% | 2966.260000 | 2452.247500 | 2181.044400 | 1081.875800 | 1.017700 | 97.920000 | 198.130700 | 7.094875 | 406.127400 | 9.567625 | 188.299825 | -5933.250000 | -4371.750000 | -1476.000000 | 5.263700 | 67.377800 | 2.088900 | 3.362700 | 84.490500 | 8.580000 | 86.578300 | 74.800000 | 2.690000 | 350.801575 | 130.728875 | 136.926800 | 115.508975 | 2836.000000 | -1.871575 | 112.022700 | 10.364300 | 17.364800 | 0.890700 | 145.237300 | 87.484200 | 7.104225 | 1777.470300 | 8564.689975 | -0.218800 | 721.023000 | 57.978300 | 3.20200 | -0.898800 | 999.996100 | 37.347250 | 92.000000 | 90.000000 | 81.300000 | 50.900100 | 243.786000 | 5.110000 | 4.459250 | 8.089750 | 0.240000 | 411.000000 | 295.000000 | 1321.000000 | 451.000000 | 2.100000 | 0.900000 | 16.850000 | 7.732500 | 21.171500 | 5.390000 | 24.711000 | 0.218000 | 14.155000 | 5.020000 | 56.158000 | 2.946100 | 95.147350 | 718.725350 | 76.132150 | 13.828000 | 2.956500 | 2.660100 | 5.765500 | 0.000000 | 0.000000 | 55.976675 | 0.743425 | 2.571400 | 31.032400 | 10.027100 | 36.290300 | 48.173800 | 0.907650 | 3.747875 | 20.221850 | 38.472775 | 105.525150 | 11.577100 | 0.000000 | 81.316150 | 76.455400 | 50.383550 | 0.000000 | 55.555150 | 139.914350 | 112.859250 | 1.747100 | 1.663750 | 0.139000 | 16.342300 | 0.000000 | 0.000000 | 35.322200 | 0.000000 | 1.552150 | 0.000000 | 0.073050 | 4.101500 | 0.484200 | 4.895450 | 1.889900 | 7.116000 | 0.797500 | 400.694000 | 73.254000 | 14.530000 | 15.762450 | 259.972500 | 4.980000 | 15.466200 | 530.702700 | 7.500000 | 2.306500 | 44.368600 | -1.000000 |
| 50% | 3011.490000 | 2499.405000 | 2201.066700 | 1285.214400 | 1.316800 | 101.512200 | 199.535600 | 8.967000 | 412.219100 | 9.851750 | 189.664200 | -5523.250000 | -3820.750000 | -78.750000 | 7.264700 | 69.155600 | 2.377800 | 3.431000 | 85.135450 | 8.769800 | 86.820700 | 78.290000 | 3.074000 | 353.720900 | 136.400000 | 140.007750 | 183.318150 | 2854.000000 | 0.947250 | 116.211800 | 13.246050 | 20.021350 | 0.978300 | 147.597300 | 102.604300 | 7.467450 | 1809.249200 | 8825.435100 | 0.000000 | 750.861400 | 58.549100 | 3.87700 | -0.141900 | 1004.050000 | 38.902600 | 109.000000 | 134.600000 | 117.700000 | 55.900100 | 339.561000 | 6.260000 | 5.951000 | 10.993500 | 0.320000 | 623.000000 | 438.000000 | 2614.000000 | 1784.000000 | 2.600000 | 1.200000 | 18.690000 | 10.170000 | 27.200500 | 6.735000 | 40.209500 | 0.259000 | 17.235000 | 6.760000 | 73.248000 | 3.630750 | 119.436000 | 967.299800 | 103.093600 | 17.977000 | 3.703500 | 3.234000 | 7.395600 | 302.177600 | 272.448700 | 69.905450 | 1.135300 | 3.453800 | 57.969300 | 151.115600 | 49.090900 | 65.437800 | 1.264550 | 5.227100 | 26.167850 | 150.340100 | 138.255150 | 15.973800 | 293.518500 | 148.317500 | 138.775500 | 112.953400 | 249.927000 | 112.275500 | 348.529400 | 219.487200 | 2.250800 | 2.529100 | 0.232500 | 22.039100 | 0.000000 | 0.000000 | 46.986100 | 0.000000 | 2.221000 | 0.000000 | 0.100000 | 5.134200 | 1.550100 | 6.410800 | 3.054800 | 7.116000 | 0.911100 | 403.122000 | 74.084000 | 16.340000 | 29.731150 | 264.272000 | 5.160000 | 16.988350 | 532.398200 | 8.650000 | 2.757650 | 71.900500 | -1.000000 |
| 75% | 3056.650000 | 2538.822500 | 2218.055500 | 1591.223500 | 1.525700 | 104.586700 | 202.007100 | 10.861875 | 419.089275 | 10.128175 | 192.189375 | -5356.250000 | -3352.750000 | 1377.250000 | 7.329700 | 72.266700 | 2.655600 | 3.531300 | 85.741900 | 9.060600 | 87.002400 | 80.200000 | 3.521000 | 360.772250 | 142.098225 | 143.195700 | 206.977150 | 2874.000000 | 4.385225 | 120.927300 | 16.376100 | 22.813625 | 1.065000 | 149.959100 | 115.498900 | 7.807625 | 1841.873000 | 9065.432400 | 0.189300 | 776.781850 | 59.133900 | 4.39200 | 0.047300 | 1008.670600 | 40.804600 | 127.000000 | 181.000000 | 161.600000 | 62.900100 | 502.205900 | 7.500000 | 8.275000 | 14.347250 | 0.450000 | 966.000000 | 625.000000 | 5034.000000 | 6384.000000 | 3.200000 | 1.500000 | 20.972500 | 13.337500 | 31.687000 | 8.450000 | 57.674750 | 0.296000 | 20.162500 | 9.490000 | 90.515000 | 4.404750 | 144.502800 | 1261.299800 | 131.758400 | 24.653000 | 4.379400 | 4.010700 | 9.168800 | 524.002200 | 582.935200 | 92.911500 | 1.539500 | 4.755800 | 120.172900 | 305.026300 | 66.666700 | 84.973400 | 1.577825 | 6.902475 | 35.278800 | 335.922400 | 168.410125 | 23.737200 | 514.585900 | 262.865250 | 294.667050 | 288.893450 | 501.607450 | 397.506100 | 510.647150 | 377.144200 | 2.839800 | 3.199100 | 0.563000 | 32.438475 | 536.204600 | 505.401000 | 64.248700 | 555.294100 | 2.903700 | 0.000000 | 0.133200 | 6.329500 | 2.211650 | 7.594250 | 3.947000 | 8.020700 | 1.285550 | 407.431000 | 78.397000 | 19.035000 | 44.113400 | 265.707000 | 7.800000 | 24.772175 | 534.356400 | 10.130000 | 3.295175 | 114.749700 | -1.000000 |
| max | 3356.350000 | 2846.440000 | 2315.266700 | 3715.041700 | 1114.536600 | 129.252200 | 272.045100 | 19.546500 | 824.927100 | 102.867700 | 215.597700 | 0.000000 | 2363.000000 | 14106.000000 | 7.658800 | 77.900000 | 3.511100 | 4.804400 | 105.603800 | 23.345300 | 88.418800 | 86.120000 | 37.880000 | 377.297300 | 176.313600 | 163.250900 | 258.543200 | 2936.000000 | 168.145500 | 287.150900 | 188.092300 | 48.988200 | 7272.828300 | 167.830900 | 238.477500 | 8.990400 | 2105.182300 | 10746.600000 | 2.569800 | 924.531800 | 311.734400 | 6.88900 | 2.458000 | 1020.994400 | 64.128700 | 994.000000 | 295.800000 | 334.700000 | 141.799800 | 1770.690900 | 103.390000 | 22.318000 | 536.564000 | 12.710000 | 7791.000000 | 4170.000000 | 37943.000000 | 36871.000000 | 21.100000 | 16.300000 | 48.670000 | 55.000000 | 72.947000 | 267.910000 | 191.830000 | 4.838000 | 199.620000 | 126.530000 | 172.349000 | 8.801500 | 1768.880200 | 3601.299800 | 1119.704200 | 40.855000 | 10.152900 | 9.690000 | 39.037600 | 999.316000 | 998.681300 | 424.215200 | 24.990400 | 186.616400 | 994.285700 | 995.744700 | 851.612900 | 657.762100 | 5.131700 | 34.490200 | 149.385100 | 999.877000 | 492.771800 | 274.887100 | 999.413500 | 989.473700 | 996.858600 | 994.000000 | 999.491100 | 995.744700 | 997.518600 | 994.003500 | 12.769800 | 9.402400 | 127.572800 | 219.643600 | 1000.000000 | 999.233700 | 451.485100 | 1000.000000 | 111.736500 | 1000.000000 | 111.333000 | 80.040600 | 8.203700 | 14.447900 | 6.580300 | 21.044300 | 3.978600 | 421.702000 | 83.720000 | 131.680000 | 101.114600 | 311.404000 | 32.580000 | 84.802400 | 589.508200 | 454.560000 | 99.303200 | 737.304800 | 1.000000 |
The five-point summary is hard to interpret on its own: the features are anonymised, so we do not know the nature of the variables or their expected value ranges.
#density plots to check the distribution of the variables
import seaborn as sns

plt.figure(figsize=(40, 40))
col = 1
float_columns = pdata3.select_dtypes(include=['float64']).columns
for i in float_columns:
    plt.subplot(20, 10, col)
    sns.distplot(pdata3[i], color='b')  # deprecated in newer seaborn; sns.histplot(pdata3[i], kde=True) is the replacement
    col += 1
plt.tight_layout()
plt.show()
pdata3['Pass/Fail'].value_counts()
-1    1463
 1     104
Name: Pass/Fail, dtype: int64
# pie chart
labels = ['Pass', 'Fail']
size = [1463, 104]  # counts must follow the label order: 1463 passes, 104 fails
colors = ['#ff9999', '#66b3ff']
explode = [0, 0.1]
plt.rcParams['figure.figsize'] = (5, 5)
plt.pie(size, labels=labels, colors=colors, explode=explode, autopct="%.2f%%", shadow=True)
plt.axis('equal')
plt.title('Pie chart of the number of tests passed or failed', fontsize=20)
plt.legend()
plt.show()
As we can see, there is a major class imbalance in the dataset, so we will have to account for it while building the model; otherwise the classifier will be biased towards the majority class, i.e. Pass.
4. Data pre-processing:
• Segregate predictors vs target attributes
• Check for target balancing and fix it if found imbalanced.
• Perform train-test split and standardise the data or vice versa if required.
• Check if the train and test data have similar statistical characteristics when compared with original data.
#Checking the NAN count after the manipulations.
pdata3.isna().sum().sort_values(ascending=False).min(),pdata3.isna().sum().sort_values(ascending=False).max()
(0, 273)
# Replacing all the NaN values with 0, as the values correspond to test results:
# a missing entry means the measurement was not taken or not calculated,
# so rather than impute with the median or mean (which would invent readings), we fill with zeros.
data = pdata3.replace(np.NaN, 0)
data.isnull().any().any()
False
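Zero-filling every NaN is a pragmatic choice. An alternative worth noting is to first drop features that are mostly empty and only zero-fill the remainder; a minimal sketch on a hypothetical toy frame (not the project data, column names are made up):

```python
import numpy as np
import pandas as pd

# Toy frame: column "b" is 75% missing and gets dropped entirely.
df = pd.DataFrame({
    "a": [1.0, np.nan, 3.0, 4.0],
    "b": [np.nan, np.nan, np.nan, 1.0],
    "c": [5.0, 6.0, 7.0, 8.0],
})

cutoff = 0.5                                   # maximum tolerated missing fraction
keep = df.columns[df.isna().mean() <= cutoff]  # fraction of NaNs per column
cleaned = df[keep].fillna(0)                   # zero-fill the sparse remainder

print(list(cleaned.columns))
```

The cutoff is a tunable assumption; the point is simply that near-empty sensors carry little signal and mostly add zero-noise after imputation.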
data.shape
(1567, 130)
print(f'total duplicate rows: {data.duplicated().sum()}') #checking duplicate values
total duplicate rows: 0
#Segregate predictors vs target attributes
X = data.drop(['Pass/Fail','Time'],axis=1) #independent variable
y = data['Pass/Fail'] # the dependent variable
We have already seen the target imbalance; we will apply various resampling methods and pick the best model.
#Perform train-test split and standardise the data or vice versa if required.
pdata4 = data.copy()  # keep an unscaled copy for reference
X.shape, y.shape
((1567, 128), (1567,))
#scaling with z-score
from scipy.stats import zscore
X = X.apply(zscore)
X.head()
| 0 | 1 | 2 | 3 | 4 | 6 | 12 | 14 | 15 | 16 | 18 | 21 | 23 | 24 | 27 | 28 | 29 | 31 | 32 | 33 | 38 | 40 | 41 | 43 | 45 | 48 | 51 | 55 | 59 | 62 | 63 | 64 | 67 | 68 | 71 | 83 | 88 | 90 | 98 | 115 | 117 | 122 | 129 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 142 | 150 | 151 | 155 | 159 | 160 | 161 | 162 | 166 | 167 | 180 | 182 | 183 | 185 | 188 | 195 | 200 | 201 | 208 | 218 | 223 | 225 | 250 | 268 | 269 | 416 | 417 | 418 | 419 | 423 | 426 | 429 | 432 | 433 | 438 | 439 | 442 | 453 | 460 | 468 | 472 | 476 | 482 | 483 | 484 | 485 | 486 | 487 | 488 | 489 | 491 | 493 | 494 | 496 | 499 | 500 | 510 | 511 | 520 | 521 | 523 | 525 | 526 | 527 | 539 | 545 | 546 | 547 | 548 | 550 | 561 | 562 | 564 | 569 | 570 | 572 | 585 | 589 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.139998 | 0.429208 | 0.032735 | 0.059342 | -0.049911 | -0.228536 | 0.348980 | -0.365993 | 0.103291 | 0.063399 | 0.309696 | 0.292325 | -0.174886 | 0.361808 | -2.829940 | -1.216514 | -0.817788 | -0.271316 | -0.637696 | 0.409564 | 0.077902 | -0.220482 | 0.510332 | -0.237419 | -0.733509 | 0.234407 | 0.999689 | -0.101652 | -0.491426 | -0.660718 | 0.309530 | 0.247765 | -0.051265 | 0.172088 | -0.615861 | -0.420652 | -1.124999 | 0.081589 | 0.974932 | 0.025089 | -0.030126 | -1.304635 | 0.413822 | 0.025178 | 0.011337 | 0.093663 | -0.484109 | -0.889819 | -0.810832 | -0.235407 | 0.055090 | -0.778200 | -0.120653 | -0.083930 | 0.137844 | 0.717651 | -0.706911 | -0.675320 | -0.699241 | -0.527389 | 0.582769 | 0.411563 | -1.446671 | -0.228349 | 1.020884 | 0.014306 | -0.825548 | -0.473669 | 0.857677 | -0.247246 | 0.949805 | 2.010145 | 2.020788 | 2.877619 | 0.641129 | -0.608061 | -0.540694 | 0.713511 | -0.948799 | -0.727969 | 0.498091 | -0.087582 | -0.577865 | -0.692371 | -0.296670 | -0.736033 | -0.222694 | -0.970756 | 0.056988 | 0.383531 | -1.516279 | 0.797852 | 1.064924 | 0.456406 | 1.326854 | -0.090939 | 1.896668 | -0.895268 | -1.159050 | -1.171968 | -1.346235 | 0.438514 | 0.184098 | -0.475073 | -0.809806 | -0.745562 | 0.238101 | -0.836511 | -0.124479 | -0.112621 | -0.083086 | -0.591276 | -0.978015 | 0.157239 | -0.758541 | -0.361891 | 0.370338 | 0.392696 | 0.445890 | -0.161887 | 0.531928 | -2.171890 | -1.556019 | -1.422765 | 0.190142 | -0.226018 | -0.196519 | -1.061159 |
| 1 | 0.464020 | -0.105874 | 0.236852 | 0.173847 | -0.059375 | 0.187826 | 0.107753 | 0.413621 | 0.097826 | -0.257726 | 0.183079 | 0.258108 | 0.218468 | -0.462781 | 0.600453 | -0.232288 | -0.231388 | -0.456551 | -0.127751 | 0.620277 | 0.331974 | 0.450272 | -0.222386 | -0.280681 | -0.406395 | 1.038760 | 0.784205 | 0.028134 | -0.224999 | -0.185041 | -0.422165 | -0.244190 | -0.050492 | 0.746756 | -0.669532 | -1.170870 | 2.313733 | -0.082522 | -2.152325 | -0.329673 | 0.006493 | -1.407994 | 0.374966 | -0.011260 | -0.309991 | -0.338726 | -1.053311 | -0.779882 | -0.066156 | -0.729464 | -0.245744 | -0.158770 | -0.030466 | -0.137539 | -0.319104 | -0.862578 | -0.888379 | -0.231166 | -0.521164 | -0.211781 | -0.302354 | -0.151947 | -1.498384 | -0.061370 | 1.832053 | -0.162989 | -1.086191 | -0.794093 | 2.159727 | 0.160828 | 0.101235 | 2.112675 | 1.527628 | 1.386885 | -0.106664 | 0.512554 | 0.216673 | -1.111919 | 0.185367 | -0.477210 | 0.255274 | -0.157359 | -0.719582 | -0.024239 | -0.178775 | -0.858212 | 0.927462 | -0.480915 | 0.844285 | 1.040989 | -1.314967 | 0.722641 | -1.113864 | 0.224751 | -0.994956 | 0.737442 | -0.581752 | 2.221206 | -0.602090 | -1.171968 | 1.016631 | 0.691072 | -0.122839 | -0.654285 | -0.809806 | -0.745562 | 2.273884 | -0.836511 | -0.187140 | -0.112621 | -0.093240 | -0.593708 | -0.584746 | 1.292987 | -0.968339 | -0.361891 | 0.924510 | 0.480768 | 0.406629 | 0.232418 | -0.743805 | -2.171890 | -1.556019 | -1.422765 | 0.256816 | -0.261137 | 0.385516 | 1.156951 |
| 2 | -0.351256 | 0.407233 | 0.026413 | 0.684661 | -0.047236 | -0.415634 | 0.295231 | 0.187040 | 0.176936 | -0.235386 | 0.344766 | 0.248603 | -0.177232 | -0.558042 | 0.749649 | -0.535129 | -0.071657 | -0.126942 | -0.177897 | -0.216922 | -0.915332 | -2.076132 | 0.896880 | 0.829567 | -0.566241 | 0.209368 | 0.466057 | 0.595093 | 2.195341 | -0.038229 | -0.366557 | -0.830023 | -0.051438 | -0.106871 | -0.594491 | -0.624959 | -2.278834 | 0.481328 | -0.292444 | -0.588677 | -0.022382 | -1.048349 | 0.297252 | -0.006585 | 0.700043 | -0.494385 | -0.206852 | -0.486087 | -0.892738 | -0.403862 | 0.486841 | -0.609656 | -0.180100 | -0.030320 | -0.325210 | 0.406130 | -0.779357 | -0.410874 | -0.610203 | 0.261630 | -0.365150 | 0.608192 | -0.543663 | 0.257281 | 0.779094 | -0.373210 | -0.090490 | -0.530329 | 0.233866 | -1.143920 | 1.081806 | -0.360949 | -0.096130 | 1.370518 | 0.131104 | -0.304499 | -0.728201 | -1.111919 | -0.948799 | -0.820891 | 0.597809 | -0.101012 | -0.638183 | -0.423970 | 0.157297 | 0.009430 | -1.391649 | 0.924885 | 0.238301 | -0.874904 | 0.221977 | -0.445433 | 0.428881 | -0.267542 | -0.994956 | -0.035012 | 1.562032 | -0.614014 | -0.612647 | -0.077957 | -1.127837 | 1.431491 | 0.087593 | -0.399831 | -0.809806 | -0.745562 | 4.908328 | -0.095153 | 0.046768 | -0.112621 | -0.091359 | 1.698854 | -0.833253 | 0.032352 | -0.682193 | -0.361891 | -0.138891 | 0.496334 | 0.407123 | 0.252968 | -0.394713 | 0.501601 | -1.234416 | 4.194421 | 0.257279 | -0.199823 | 0.030373 | -0.178407 |
| 3 | -0.070903 | -0.025985 | 0.086766 | -1.033387 | -0.050620 | 0.354494 | 0.273601 | 0.218770 | 0.400773 | -0.080439 | 0.282228 | 0.217428 | -0.515011 | -0.468643 | 0.540791 | -1.521962 | 0.674474 | -0.521564 | -0.125959 | -0.202317 | -0.042209 | 0.396881 | -0.850790 | 0.705958 | -0.566241 | 0.463448 | 0.534080 | 0.595093 | 2.253659 | 0.011190 | -0.051340 | -0.937707 | -0.051090 | 0.077830 | -1.045574 | 0.264377 | -1.041069 | -0.207250 | 0.083944 | -0.782633 | 0.005798 | -0.784679 | 0.685818 | -0.256998 | 0.466763 | 0.162845 | -0.361087 | 0.029477 | -0.691683 | -0.945862 | -0.298668 | 0.698713 | 0.039410 | -0.137539 | -0.022953 | -0.347437 | -0.148350 | -0.272849 | -0.966358 | -0.527389 | -0.831634 | 0.934309 | -0.594647 | -0.232524 | 0.432081 | -0.373210 | -0.452875 | -0.368163 | 1.126857 | 1.508863 | 0.530337 | -0.402307 | -0.285006 | 1.666867 | -0.476790 | -0.865370 | -0.128104 | -0.007350 | -0.948799 | 0.517772 | 0.313920 | 0.127143 | -0.183998 | -0.106313 | -0.603482 | -0.755555 | -1.416966 | 0.674929 | -0.112113 | -0.862251 | -0.769149 | 1.028031 | -0.314486 | -0.533240 | 0.440043 | -0.501403 | -0.672088 | -0.622807 | -1.368352 | 0.300708 | -0.530819 | 0.669298 | -0.122128 | 0.014539 | -0.809806 | 1.458969 | 1.534598 | -0.836511 | -0.044627 | -0.112621 | -0.085908 | 0.324193 | -0.733787 | 0.048082 | -0.680064 | -0.361891 | 0.567250 | 0.241239 | 0.329025 | 15.090144 | 1.602238 | 0.513254 | 0.584104 | 0.619876 | 0.002548 | -0.221613 | -0.282803 | -0.274469 |
| 4 | 0.146544 | 0.098340 | 0.250931 | -0.125070 | -0.046823 | 0.016475 | 0.285608 | 0.559439 | 0.332726 | 0.184487 | 0.219211 | 0.205262 | -0.134288 | 0.143182 | 0.509419 | -1.545458 | 1.900518 | -0.718603 | 0.352978 | -0.137143 | -0.283457 | 0.376711 | -0.459614 | -0.180441 | 4.627127 | -0.312713 | 1.249324 | 0.110104 | -1.602699 | 2.492136 | 1.123988 | 2.299714 | -0.051255 | -0.020860 | -1.181688 | 0.099038 | 0.384333 | 0.293793 | -1.530722 | 1.005673 | -0.037036 | -0.767805 | -0.013682 | -0.080578 | -0.255419 | 0.024481 | 0.101619 | 0.018104 | -0.617216 | -0.440604 | -0.819556 | 0.677799 | -0.048904 | -0.066060 | -0.185785 | -0.472742 | -0.546210 | -0.511186 | 0.992496 | -0.685192 | -1.124681 | 0.018305 | 1.612689 | -0.470468 | 1.333237 | -0.092071 | 0.257034 | -0.446315 | 0.681172 | 0.310799 | -0.973833 | 0.859924 | 0.724924 | 0.061296 | -0.120561 | -0.060163 | -0.975740 | -1.111919 | 1.713255 | 0.235665 | 0.063069 | -0.046079 | -0.439501 | -0.909497 | -0.857838 | -0.861080 | -0.946867 | -1.219514 | 0.128389 | -0.772878 | -0.976534 | 1.512823 | -0.504789 | 0.380740 | 1.172734 | 0.193032 | -1.036177 | 1.334957 | 1.580163 | -1.171968 | -0.152036 | 0.761221 | -0.139003 | 3.179479 | 0.093284 | -0.745562 | 2.449703 | -0.836511 | -0.025401 | -0.112621 | -0.094615 | -0.628050 | 0.808501 | -0.011237 | -0.757831 | -0.361891 | 1.135896 | 0.421618 | 0.566073 | 0.698649 | -0.536710 | -2.171890 | -1.556019 | -1.422765 | 0.085279 | -0.227409 | 26.907579 | -0.274469 |
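Note that `zscore` here is applied to the full dataset before splitting, so the test rows influence the scaling statistics. A leakage-free variant fits the scaler on the training split only and reuses its statistics on the test split; a sketch on toy data (illustrative only, not the sensor matrix):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Toy data standing in for the sensor matrix.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=2.0, size=(100, 3))
y = rng.integers(0, 2, size=100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=14)

scaler = StandardScaler().fit(X_tr)  # mean/std come from the training split only
X_tr_s = scaler.transform(X_tr)
X_te_s = scaler.transform(X_te)      # test reuses the training statistics

print(X_tr_s.mean(axis=0), X_tr_s.std(axis=0))
```

With this ordering the train features are exactly zero-mean/unit-variance, while the test features are close to it but not exactly, which is the honest situation at prediction time.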
#label encoding the target class
y=y.replace([-1,1],[0,1])
y.head()
0    0
1    0
2    1
3    0
4    0
Name: Pass/Fail, dtype: int64
# splitting the data into train and test sets, stratified on the target
from sklearn.model_selection import train_test_split
X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=0.30, random_state=14, stratify=y)
print("Training Fail: {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 1]), (len(Y_train[Y_train[:] == 1])/len(Y_train)) * 100))
print("Training Pass: {0} ({1:0.2f}%)".format(len(Y_train[Y_train[:] == 0]), (len(Y_train[Y_train[:] == 0])/len(Y_train)) * 100))
print("")
print("Test Fail: {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 1]), (len(Y_test[Y_test[:] == 1])/len(Y_test)) * 100))
print("Test Pass: {0} ({1:0.2f}%)".format(len(Y_test[Y_test[:] == 0]), (len(Y_test[Y_test[:] == 0])/len(Y_test)) * 100))
print("")
Training Fail: 73 (6.66%)
Training Pass: 1023 (93.34%)

Test Fail: 31 (6.58%)
Test Pass: 440 (93.42%)
# Initializing various classification algorithms on the original (imbalanced) dataset and choosing the best model based on f1 score for tuning
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM", LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=55, shuffle=True)
    cv_results = cross_val_score(model, X_train, Y_train, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 7.318182% (9.357223%)
KNN: 2.222222% (6.666667%)
GNB: 13.418033% (2.408454%)
SVM: 7.333333% (9.043107%)
DT: 18.984021% (12.344026%)
RF: 0.000000% (0.000000%)
AB: 19.676768% (14.481678%)
GBT: 5.484848% (8.411425%)
XGB: 0.000000% (0.000000%)
LightGBM: 0.000000% (0.000000%)
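The near-zero F1 scores reflect the imbalance: most models simply predict the majority class. Besides resampling, many scikit-learn estimators accept `class_weight='balanced'` as a lighter-weight alternative that reweights the loss instead of the data; a hedged sketch on a synthetic imbalanced problem (not the sensor data):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic problem with ~7% positives, similar to the Pass/Fail ratio.
X, y = make_classification(n_samples=1000, n_features=20, weights=[0.93],
                           random_state=0)

plain = LogisticRegression(max_iter=1000)
weighted = LogisticRegression(max_iter=1000, class_weight="balanced")

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
f1_plain = cross_val_score(plain, X, y, cv=cv, scoring="f1").mean()
f1_weighted = cross_val_score(weighted, X, y, cv=cv, scoring="f1").mean()
print(f1_plain, f1_weighted)
```

Whether reweighting beats resampling is data-dependent, but it avoids generating or discarding rows and so composes cleanly with cross-validation.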
# Implementing random under sampling
from imblearn.under_sampling import RandomUnderSampler
under = RandomUnderSampler(sampling_strategy=0.5)
X_under, y_under = under.fit_resample(X_train, Y_train)
print("Under Training Fail : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 1]), (len(y_under[y_under[:] == 1])/len(y_under)) * 100))
print("under Training Pass : {0} ({1:0.2f}%)".format(len(y_under[y_under[:] == 0]), (len(y_under[y_under[:] == 0])/len(y_under)) * 100))
Under Training Fail : 73 (33.33%)
under Training Pass : 146 (66.67%)
# Initializing various classification algorithms with the random under sampled dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=66, shuffle=True)
    cv_results = cross_val_score(model, X_under, y_under, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 50.575307% (19.606195%)
KNN: 38.449883% (22.288522%)
GNB: 33.614219% (21.425752%)
SVM: 42.944570% (15.575726%)
DT: 40.158730% (7.633914%)
RF: 20.777778% (11.964969%)
AB: 50.697636% (16.338983%)
GBT: 38.760684% (21.999716%)
XGB: 36.726496% (14.766934%)
LightGBM: 30.590909% (22.454775%)
# Implementing SMOTE
from imblearn.over_sampling import SMOTE
smt = SMOTE(sampling_strategy=0.5)
X_SMOTE, y_SMOTE = smt.fit_resample(X_train, Y_train)
print("SMOTE Training Fail : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 1]), (len(y_SMOTE[y_SMOTE[:] == 1])/len(y_SMOTE)) * 100))
print("SMOTE Training Pass : {0} ({1:0.2f}%)".format(len(y_SMOTE[y_SMOTE[:] == 0]), (len(y_SMOTE[y_SMOTE[:] == 0])/len(y_SMOTE)) * 100))
SMOTE Training Fail : 511 (33.31%)
SMOTE Training Pass : 1023 (66.69%)
# Initializing various classification algorithms with the SMOTE dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=25, shuffle=True)
    cv_results = cross_val_score(model, X_SMOTE, y_SMOTE, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 76.434131% (4.317839%)
KNN: 67.760122% (2.058935%)
GNB: 58.990944% (2.757849%)
SVM: 78.138911% (4.649216%)
DT: 81.501924% (3.059513%)
RF: 96.071893% (1.219916%)
AB: 86.803825% (3.017209%)
GBT: 93.583102% (2.707140%)
XGB: 96.081117% (1.641121%)
LightGBM: 96.949087% (1.397378%)
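One caveat with these scores: because SMOTE was applied to the whole training set before cross-validation, synthetic neighbours of a validation-fold point can sit in the training folds, which tends to inflate the F1 estimates. A fold-safe alternative uses an imblearn `Pipeline`, so resampling happens inside each training fold only; a sketch on synthetic data (toy problem, illustrative only):

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Imbalanced toy problem (~7% positives).
X, y = make_classification(n_samples=800, n_features=20, weights=[0.93],
                           random_state=0)

# SMOTE runs on each training fold only; validation folds stay untouched.
pipe = Pipeline([
    ("smote", SMOTE(sampling_strategy=0.5, random_state=0)),
    ("rf", RandomForestClassifier(random_state=0)),
])

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(pipe, X, y, cv=cv, scoring="f1")
print(scores.mean())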
# Implementing random over sampling
from imblearn.over_sampling import RandomOverSampler
over = RandomOverSampler(sampling_strategy=0.5)
X_over, y_over = over.fit_resample(X_train, Y_train)
print("over Training Fail : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 1]), (len(y_over[y_over[:] == 1])/len(y_over)) * 100))
print("over Training Pass : {0} ({1:0.2f}%)".format(len(y_over[y_over[:] == 0]), (len(y_over[y_over[:] == 0])/len(y_over)) * 100))
over Training Fail : 511 (33.31%)
over Training Pass : 1023 (66.69%)
# Initializing various classification algorithms with the over sampled dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=69, shuffle=True)
    cv_results = cross_val_score(model, X_over, y_over, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 76.969385% (4.426584%)
KNN: 86.399309% (3.411599%)
GNB: 56.891672% (2.068149%)
SVM: 79.690826% (3.845625%)
DT: 91.407502% (2.699546%)
RF: 99.902913% (0.291262%)
AB: 92.355846% (3.119171%)
GBT: 98.296291% (1.721297%)
XGB: 98.659287% (1.141959%)
LightGBM: 99.902913% (0.291262%)
# Implementing ADASYN over sampling
from imblearn.over_sampling import ADASYN
oversample = ADASYN(sampling_strategy=0.5)
X_adasyn, y_adasyn = oversample.fit_resample(X_train, Y_train)
print("ADASYN Training Fail : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 1]), (len(y_adasyn[y_adasyn[:] == 1])/len(y_adasyn)) * 100))
print("ADASYN Training Pass : {0} ({1:0.2f}%)".format(len(y_adasyn[y_adasyn[:] == 0]), (len(y_adasyn[y_adasyn[:] == 0])/len(y_adasyn)) * 100))
ADASYN Training Fail : 528 (34.04%)
ADASYN Training Pass : 1023 (65.96%)
# Initializing various classification algorithms with the ADASYN dataset and choosing the best model based on f1 score
models = []
models.append(("LR", LogisticRegression()))
models.append(("KNN", KNeighborsClassifier()))
models.append(("GNB", GaussianNB()))
models.append(("SVM", SVC(kernel='linear')))
models.append(("DT", DecisionTreeClassifier()))
models.append(("RF", RandomForestClassifier()))
models.append(("AB", AdaBoostClassifier()))
models.append(("GBT", GradientBoostingClassifier()))
models.append(("XGB", XGBClassifier(verbosity=0)))
models.append(("LightGBM",LGBMClassifier()))
#testing models
results = []
names = []
for name, model in models:
    kfold = StratifiedKFold(n_splits=10, random_state=33, shuffle=True)
    cv_results = cross_val_score(model, X_adasyn, y_adasyn, cv=kfold, scoring='f1')
    results.append(cv_results)
    names.append(name)
    msg = '%s: %f%% (%f%%)' % (name, cv_results.mean()*100, cv_results.std()*100)
    print(msg)
LR: 76.298823% (3.114821%)
KNN: 68.493469% (3.073125%)
GNB: 59.176173% (1.746963%)
SVM: 79.403104% (3.396940%)
DT: 81.828630% (6.250010%)
RF: 97.772339% (0.992679%)
AB: 87.075018% (3.013979%)
GBT: 93.895434% (2.951183%)
XGB: 95.917887% (2.139443%)
LightGBM: 96.907456% (1.442754%)
nb = GaussianNB()
nb.fit(X_train, Y_train)
GaussianNB()
modelnb_score = nb.score(X_train, Y_train)
print('Accuracy Score of Training Data: ', modelnb_score)
Accuracy Score of Training Data: 0.39324817518248173
from sklearn.metrics import accuracy_score
y_predictnb = nb.predict(X_test)
modelnb_score = accuracy_score(Y_test, y_predictnb)
print('Accuracy Score of Test Data:', modelnb_score)
Accuracy Score of Test Data: 0.3524416135881104
#printing classification report
from sklearn import metrics
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictnb))
Classification Report
precision recall f1-score support
0 0.97 0.32 0.48 440
1 0.08 0.84 0.15 31
accuracy 0.35 471
macro avg 0.52 0.58 0.31 471
weighted avg 0.91 0.35 0.46 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictnb)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.xlabel('Predicted Classes', fontsize = 15)
plt.ylabel('Actual Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB', fontsize = 15);
#Plotting ROC and AUC
probs = nb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_nb = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='Gaussian Naive Bayes (AUC = %0.2f)' % roc_auc_nb)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr': pd.Series(fpr, index=i),
                    'tpr': pd.Series(tpr, index=i),
                    '1-fpr': pd.Series(1 - fpr, index=i),
                    'tf': pd.Series(tpr - (1 - fpr), index=i),
                    'threshold': pd.Series(threshold, index=i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
         fpr       tpr     1-fpr        tf  threshold
38  0.384091  0.612903  0.615909 -0.003006   0.999755
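The table above picks the point where `tpr` is closest to `1 - fpr`, which is equivalent to maximising Youden's J statistic (tpr − fpr). The same threshold can be read off directly with `argmax`; a small self-contained illustration with made-up labels and probabilities:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Made-up true labels and predicted probabilities, for illustration only.
y_true = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.2, 0.3, 0.35, 0.4, 0.6, 0.8, 0.45, 0.9, 0.05])

fpr, tpr, thresholds = roc_curve(y_true, y_prob)

# Youden's J: the threshold maximising tpr - fpr,
# i.e. where tpr is closest to 1 - fpr.
best = np.argmax(tpr - fpr)
print(thresholds[best])
```

This one-liner avoids building the intermediate DataFrame when only the threshold itself is needed.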
# store the predicted probabilities for the Fail class
y_pred_prob = nb.predict_proba(X_test)[:, 1]
# predict Fail if the predicted probability is greater than 0.999755
y_pred_class = binarize([y_pred_prob], threshold=0.999755)[0]
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB', fontsize = 15);
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.10 0.61 0.17 31
0 0.96 0.62 0.75 440
accuracy 0.62 471
macro avg 0.53 0.62 0.46 471
weighted avg 0.90 0.62 0.71 471
precision_nb, recall_nb, f1_score_nb, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_nb)
print('Recall Score :', '%0.2f' % recall_nb)
print('F1-Score:', '%0.2f' % f1_score_nb)
nb_acc= accuracy_score(Y_test, y_predictnb)
print('Accuracy Score :','%0.2f' % nb_acc)
print('AUC :','%0.2f' % roc_auc_nb)
print('Thresholdnb :','%0.2f' % 0.999755)
Thresholdnb=0.999755
Precision Score : 0.53
Recall Score : 0.62
F1-Score: 0.46
Accuracy Score : 0.35
AUC : 0.65
Thresholdnb : 1.00
nbu = GaussianNB()
nbu.fit(X_under, y_under)
GaussianNB()
modelnbu_score = nbu.score(X_under,y_under)
print('Accuracy Score of Training Data: ', modelnbu_score)
Accuracy Score of Training Data: 0.7579908675799086
y_predictnbu= nbu.predict(X_test)
modelnbu_score = accuracy_score(Y_test, y_predictnbu)
print('Accuracy Score of Test Data:', modelnbu_score)
Accuracy Score of Test Data: 0.8535031847133758
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictnbu, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.09 0.13 0.10 31
0 0.94 0.90 0.92 440
accuracy 0.85 471
macro avg 0.51 0.52 0.51 471
weighted avg 0.88 0.85 0.87 471
#visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictnbu)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB Under sampled', fontsize = 15);
#Plotting ROC and AUC
probs = nbu.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_nbu = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='Gaussian NB under sampled (AUC = %0.2f)' % roc_auc_nbu)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
        fpr       tpr     1-fpr        tf     threshold
40  0.420455  0.612903  0.579545  0.033358  2.999035e-08
# store the predicted probabilities for failed class
y_pred_prob = nbu.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.00055
y_pred_class = binarize([y_pred_prob], threshold=0.00055)[0]
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for GNB Under sampled', fontsize = 15);
precision_nbu, recall_nbu, f1_score_nbu, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_nbu)
print('Recall Score :', '%0.2f' % recall_nbu)
print('F1-Score:', '%0.2f' % f1_score_nbu)
nbu_acc= accuracy_score(Y_test, y_predictnbu)
print('Accuracy Score :','%0.2f' % nbu_acc)
print('AUC :','%0.2f' % roc_auc_nbu)
print('Thresholdnbu:','%0.2f' % 0.00055)
Thresholdnbu=0.00055
Precision Score : 0.54
Recall Score : 0.59
F1-Score: 0.54
Accuracy Score : 0.85
AUC : 0.64
Thresholdnbu: 0.00
param_test ={'num_leaves': sp_randint(6, 50),
'min_child_samples': sp_randint(100, 500),
'min_child_weight': [1e-5, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2, 1e3, 1e4],
'subsample': sp_uniform(loc=0.2, scale=0.8),
'colsample_bytree': sp_uniform(loc=0.4, scale=0.6),
'reg_alpha': [0, 1e-1, 1, 2, 5, 7, 10, 50, 100],
'reg_lambda': [0, 1e-1, 1, 5, 10, 20, 50, 100]}
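`sp_randint` and `sp_uniform` here are scipy's `randint` and `uniform` distributions, which `RandomizedSearchCV` draws candidate values from. A minimal illustration of the ranges two of these entries produce:

```python
from scipy.stats import randint as sp_randint, uniform as sp_uniform

rng_leaves = sp_randint(6, 50)                  # integers in [6, 50)
rng_subsample = sp_uniform(loc=0.2, scale=0.8)  # floats in [0.2, 1.0)

# sample 1000 draws from each, as the search would
leaves = rng_leaves.rvs(size=1000, random_state=0)
subs = rng_subsample.rvs(size=1000, random_state=0)
```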
sample = 100
# n_estimators is set to a large value (2000); with early stopping this would only be an upper bound
# on the number of trees built, but note that no early stopping is configured in this search
lgb = LGBMClassifier(max_depth=-1, random_state=31, silent=True, metric='f1', n_jobs=4, n_estimators=2000)
gs = RandomizedSearchCV(
estimator=lgb, param_distributions=param_test,
n_iter=sample,
scoring='f1',
cv=5,
refit=True,
random_state=314,
verbose=True)
gs.fit(X_SMOTE, y_SMOTE)
gs.best_params_
Fitting 5 folds for each of 100 candidates, totalling 500 fits
{'colsample_bytree': 0.952164731370897,
'min_child_samples': 111,
'min_child_weight': 0.01,
'num_leaves': 38,
'reg_alpha': 0,
'reg_lambda': 0.1,
'subsample': 0.3029313662262354}
lgb=LGBMClassifier(colsample_bytree=0.95,
min_child_samples= 111,
min_child_weight= 0.01,
num_leaves= 38,
reg_alpha= 0,
reg_lambda= 0.1,
subsample=0.30)
lgb.fit(X_SMOTE,y_SMOTE)
LGBMClassifier(colsample_bytree=0.95, min_child_samples=111,
min_child_weight=0.01, num_leaves=38, reg_alpha=0,
reg_lambda=0.1, subsample=0.3)
modellgb1=lgb.score(X_SMOTE,y_SMOTE)
print('Accuracy Score of Training Data: ', modellgb1)
Accuracy Score of Training Data: 1.0
y_predictlg1= lgb.predict(X_test)
modellg1 = accuracy_score(Y_test, y_predictlg1)
print('Accuracy Score of Test Data:', modellg1)
Accuracy Score of Test Data: 0.9278131634819533
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlg1, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.29 0.06 0.11 31
0 0.94 0.99 0.96 440
accuracy 0.93 471
macro avg 0.61 0.53 0.53 471
weighted avg 0.89 0.93 0.91 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictlg1)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM Smote', fontsize = 15);
#Plotting ROC and AUC
probs = lgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lg = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LGBM Smote sampled (AUC = %0.2f)' % roc_auc_lg)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
        fpr       tpr     1-fpr        tf  threshold
35  0.370455  0.645161  0.629545  0.015616   0.036691
# store the predicted probabilities for failed class
y_pred_prob = lgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.039903
y_pred_class = binarize([y_pred_prob], threshold=0.039903)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.11 0.61 0.18 31
0 0.96 0.64 0.77 440
accuracy 0.64 471
macro avg 0.53 0.63 0.48 471
weighted avg 0.90 0.64 0.73 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM Smote', fontsize = 15);
precision_lg, recall_lg, f1_score_lg, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lg)
print('Recall Score :', '%0.2f' % recall_lg)
print('F1-Score:', '%0.2f' % f1_score_lg)
lg_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lg_acc)
print('AUC :','%0.2f' % roc_auc_lg)
print('Thresholdlg :','%0.2f' % 0.039903)
Thresholdlg=0.039903
Precision Score : 0.53
Recall Score : 0.63
F1-Score: 0.48
Accuracy Score : 0.64
AUC : 0.71
Thresholdlg : 0.04
# Number of trees in random forest
n_estimators = [int(x) for x in np.linspace(start = 50, stop = 500, num = 50)]
# Number of features to consider at every split
max_features = ['auto', 'sqrt']
# Maximum number of levels in tree
max_depth = [int(x) for x in np.linspace(10, 110, num = 11)]
max_depth.append(None)
# Minimum number of samples required to split a node
min_samples_split = range(2,100,5)
# Minimum number of samples required at each leaf node
min_samples_leaf = range(1,100,10)
# Method of selecting samples for training each tree
bootstrap = [True, False]
# Create the random grid
random_grid = {'n_estimators': n_estimators,
'max_features': max_features,
'max_depth': max_depth,
'min_samples_split': min_samples_split,
'min_samples_leaf': min_samples_leaf,
'bootstrap': bootstrap,
'criterion':['gini','entropy']}
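This grid is far larger than what the search actually visits: `RandomizedSearchCV` is left at its default `n_iter=10`, so only 10 combinations are sampled (which is why the output reports "10 candidates"). A quick count of the full grid, using the sizes of the ranges defined above:

```python
# sizes of each dimension of random_grid as defined above
n_estimators_n      = 50                         # np.linspace(50, 500, 50)
max_features_n      = 2                          # ['auto', 'sqrt']
max_depth_n         = 12                         # 11 values + None
min_samples_split_n = len(range(2, 100, 5))      # 20
min_samples_leaf_n  = len(range(1, 100, 10))     # 10
bootstrap_n         = 2
criterion_n         = 2

total = (n_estimators_n * max_features_n * max_depth_n *
         min_samples_split_n * min_samples_leaf_n * bootstrap_n * criterion_n)
# the default n_iter=10 therefore samples a tiny fraction of the grid;
# raising n_iter (at more compute cost) would explore it more thoroughly
```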
rf = RandomForestClassifier()
rf_random = RandomizedSearchCV(estimator = rf, param_distributions = random_grid, cv = 5, verbose=2, random_state=90, n_jobs = -1)
rf_random.fit(X_over, y_over)
rf_random.best_params_
Fitting 5 folds for each of 10 candidates, totalling 50 fits
{'n_estimators': 463,
'min_samples_split': 82,
'min_samples_leaf': 1,
'max_features': 'sqrt',
'max_depth': 110,
'criterion': 'gini',
'bootstrap': False}
rf_grid1 = RandomForestClassifier(n_estimators=463,
min_samples_split= 82,
min_samples_leaf=1,
max_features= 'sqrt',
max_depth= 110,
criterion= 'gini',
bootstrap= False)
rf_grid1.fit(X_over, y_over)
RandomForestClassifier(bootstrap=False, max_depth=110, max_features='sqrt',
min_samples_split=82, n_estimators=463)
modelrfg1_score=rf_grid1.score(X_over,y_over)
print('Accuracy Score of Training Data: ', modelrfg1_score)
Accuracy Score of Training Data: 1.0
y_predictrfg1= rf_grid1.predict(X_test)
modelrfg1_score = accuracy_score(Y_test, y_predictrfg1)
print('Accuracy Score of Test Data:', modelrfg1_score)
Accuracy Score of Test Data: 0.9341825902335457
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictrfg1, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.00 0.00 0.00 31
0 0.93 1.00 0.97 440
accuracy 0.93 471
macro avg 0.47 0.50 0.48 471
weighted avg 0.87 0.93 0.90 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_predictrfg1)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for RF Over Sampled', fontsize = 15);
#Plotting ROC and AUC
probs = rf_grid1.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_rfo = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='RF over sampled (AUC = %0.2f)' % roc_auc_rfo)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
        fpr       tpr     1-fpr        tf  threshold
42  0.311364  0.677419  0.688636 -0.011217   0.180128
# store the predicted probabilities for failed class
y_pred_prob = rf_grid1.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.1688
y_pred_class = binarize([y_pred_prob], threshold=0.1688)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.13 0.81 0.23 31
0 0.98 0.63 0.77 440
accuracy 0.64 471
macro avg 0.56 0.72 0.50 471
weighted avg 0.92 0.64 0.73 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for RF Over Sampled', fontsize = 15);
precision_rfo, recall_rfo, f1_score_rfo, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_rfo)
print('Recall Score :', '%0.2f' % recall_rfo)
print('F1-Score:', '%0.2f' % f1_score_rfo)
rfo_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % rfo_acc)
print('AUC :','%0.2f' % roc_auc_rfo)
print('Thresholdrf :','%0.2f' % 0.18087)
Thresholdrf=0.18087
Precision Score : 0.56
Recall Score : 0.72
F1-Score: 0.50
Accuracy Score : 0.64
AUC : 0.74
Thresholdrf : 0.18
param_test ={'num_leaves': sp_randint(6, 50),
'min_child_samples': sp_randint(100, 500),
'min_child_weight': [1e-5, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2, 1e3, 1e4],
'subsample': sp_uniform(loc=0.2, scale=0.8),
'colsample_bytree': sp_uniform(loc=0.4, scale=0.6),
'reg_alpha': [0, 1e-1, 1, 2, 5, 7, 10, 50, 100],
'reg_lambda': [0, 1e-1, 1, 5, 10, 20, 50, 100]}
sample = 100
# n_estimators is set to a large value (2000); with early stopping this would only be an upper bound
# on the number of trees built, but note that no early stopping is configured in this search
lgb = LGBMClassifier(max_depth=-1, random_state=31, silent=True, metric='f1', n_jobs=4, n_estimators=2000)
gs = RandomizedSearchCV(
estimator=lgb, param_distributions=param_test,
n_iter=sample,
scoring='f1',
cv=5,
refit=True,
random_state=314,
verbose=True)
gs.fit(X_adasyn, y_adasyn)
gs.best_params_
Fitting 5 folds for each of 100 candidates, totalling 500 fits
{'colsample_bytree': 0.952164731370897,
'min_child_samples': 111,
'min_child_weight': 0.01,
'num_leaves': 38,
'reg_alpha': 0,
'reg_lambda': 0.1,
'subsample': 0.3029313662262354}
lgb=LGBMClassifier(colsample_bytree=0.95,
min_child_samples= 111,
min_child_weight= 0.01,
num_leaves= 38,
reg_alpha= 0,
reg_lambda= 0.1,
subsample=0.30)
lgb.fit(X_adasyn,y_adasyn)
LGBMClassifier(colsample_bytree=0.95, min_child_samples=111,
min_child_weight=0.01, num_leaves=38, reg_alpha=0,
reg_lambda=0.1, subsample=0.3)
modellgb=lgb.score(X_adasyn,y_adasyn)
print('Accuracy Score of Training Data: ', modellgb)
Accuracy Score of Training Data: 1.0
y_predictlg= lgb.predict(X_test)
modellg = accuracy_score(Y_test, y_predictlg)
print('Accuracy Score of Test Data:', modellg)
Accuracy Score of Test Data: 0.9193205944798302
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_predictlg, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.23 0.10 0.14 31
0 0.94 0.98 0.96 440
accuracy 0.92 471
macro avg 0.58 0.54 0.55 471
weighted avg 0.89 0.92 0.90 471
#Plotting ROC and AUC
probs = lgb.predict_proba(X_test)
preds = probs[:,1]
fpr, tpr, threshold = metrics.roc_curve(Y_test, preds)
roc_auc_lg1 = metrics.auc(fpr, tpr)
plt.plot(fpr, tpr, label='LGBM ADASYN (AUC = %0.2f)' % roc_auc_lg1)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([-0.02, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
i = np.arange(len(tpr)) # index for df
roc = pd.DataFrame({'fpr' : pd.Series(fpr, index=i),'tpr' : pd.Series(tpr, index = i), '1-fpr' : pd.Series(1-fpr, index = i), 'tf' : pd.Series(tpr - (1-fpr), index = i), 'threshold' : pd.Series(threshold, index = i)})
print(roc.loc[(roc.tf-0).abs().argsort()[:1]])
        fpr       tpr     1-fpr        tf  threshold
37  0.338636  0.677419  0.661364  0.016056   0.042971
# store the predicted probabilities for failed class
y_pred_prob = lgb.predict_proba(X_test)[:, 1]
# predict fail if the predicted probability is greater than 0.039017
y_pred_class = binarize([y_pred_prob], threshold=0.039017)[0]
#printing classification report
print("Classification Report")
print(metrics.classification_report(Y_test, y_pred_class, labels=[1, 0]))
Classification Report
precision recall f1-score support
1 0.12 0.68 0.20 31
0 0.97 0.63 0.77 440
accuracy 0.64 471
macro avg 0.54 0.66 0.48 471
weighted avg 0.91 0.64 0.73 471
# visualizing confusion matrix
cm= confusion_matrix(Y_test, y_pred_class)
plt.figure(figsize = (6, 4))
sns.heatmap(cm, annot = True, cmap = 'RdYlGn', fmt = 'd')
plt.ylabel('Actual Classes', fontsize = 15)
plt.xlabel('Predicted Classes', fontsize = 15)
plt.title('Confusion Matrix for LGBM ADASYN', fontsize = 15);
precision_lg1, recall_lg1, f1_score_lg1, support = precision_recall_fscore_support(Y_test, y_pred_class, average = 'macro')
print('Precision Score :', '%0.2f' % precision_lg1)
print('Recall Score :', '%0.2f' % recall_lg1)
print('F1-Score:', '%0.2f' % f1_score_lg1)
lg1_acc= accuracy_score(Y_test, y_pred_class)
print('Accuracy Score :','%0.2f' % lg1_acc)
print('AUC :','%0.2f' % roc_auc_lg1)
print('Thresholdlg1 :','%0.2f' % 0.039017)
Thresholdlg1=0.039017
Precision Score : 0.54
Recall Score : 0.66
F1-Score: 0.48
Accuracy Score : 0.64
AUC : 0.74
Thresholdlg1 : 0.04
modellists = []
modellists.append(['Gaussian NB Normal Data', nb_acc * 100, recall_nb * 100, precision_nb * 100,roc_auc_nb*100,f1_score_nb*100,Thresholdnb])
modellists.append(['Gaussian NB under-sampled data', nbu_acc * 100, recall_nbu * 100, precision_nbu * 100, roc_auc_nbu * 100, f1_score_nbu * 100, Thresholdnbu])
modellists.append(['LGBM Smote sampled Data', lg_acc * 100, recall_lg * 100, precision_lg * 100,roc_auc_lg*100,f1_score_lg*100,Thresholdlg])
modellists.append(['Random Forest Over sampled Data', rfo_acc * 100, recall_rfo * 100, precision_rfo * 100,roc_auc_rfo*100,f1_score_rfo*100,Thresholdrf])
modellists.append(['LGBM ADASYN sampled Data', lg1_acc * 100, recall_lg1 * 100, precision_lg1 * 100,roc_auc_lg1*100,f1_score_lg1*100,Thresholdlg1])
model_df = pd.DataFrame(modellists, columns = ['Model', 'Accuracy Scores on Test', 'Recall Score', 'Precision Score','AUC','F1 Score','Threshold'])
model_df
| | Model | Accuracy Scores on Test | Recall Score | Precision Score | AUC | F1 Score | Threshold |
|---|---|---|---|---|---|---|---|
| 0 | Gaussian NB Normal Data | 35.244161 | 61.554252 | 52.967538 | 64.802053 | 46.284657 | 0.999755 |
| 1 | Gaussian NB under-sampled data | 85.350318 | 58.629032 | 53.920720 | 63.995601 | 54.136214 | 0.000550 |
| 2 | LGBM Smote sampled Data | 63.906582 | 62.690616 | 53.326415 | 71.224340 | 47.554234 | 0.039903 |
| 3 | Random Forest Over sampled Data | 64.118896 | 71.799853 | 55.588865 | 74.479472 | 49.728112 | 0.180870 |
| 4 | LGBM ADASYN sampled Data | 63.694268 | 65.575513 | 54.039127 | 73.988270 | 48.130760 | 0.039017 |
Among the given models, the Random Forest on over-sampled data gives the best results, with the highest recall (~0.72) and AUC (~0.74) after threshold tuning.
def display_sbs(*args):
# Objective: To display dataframes side by side, for clearer and concise presentation
# Application: Simply pass two dataframes as arguments. * Works only for dataframes
from IPython.display import display_html
html_str=''
for df in args:
html_str+=df.to_html()
display_html(html_str.replace('table','table style="display:inline"'),raw=True)
return
def find_pca(data,var=95, verbosity=0):
var/=100
for i in range(1,data.shape[1]+1):
pca = PCA(n_components=i, random_state=14, whiten=True)
pca_data = pca.fit_transform(data)
#print(pca.explained_variance_ratio_)
if np.cumsum(pca.explained_variance_ratio_)[-1] >=var:
if verbosity == 1:
evr = np.cumsum(pca.explained_variance_ratio_)
#print("Overall variances captured: ",evr)
#print('variances: ', pca.explained_variance_ratio_)
fig = px.area(
x=range(1, evr.shape[0] + 1),
y=evr,
labels={"x": "# Components", "y": "Explained Variance"}
)
fig.show()
else:
print("Overall variances captured: ",np.cumsum(pca.explained_variance_ratio_)[-1])
break
return i, pca
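`find_pca` above refits a PCA for every candidate component count, which is quadratic in the number of features. The same answer can be obtained from a single full fit plus a cumulative sum over the explained-variance ratios (a sketch; the helper name `n_components_for_variance` is ours):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import scale

def n_components_for_variance(data, var=0.95):
    """Fit PCA once and return the smallest component count whose
    cumulative explained variance reaches `var`."""
    pca = PCA(random_state=14).fit(data)
    cum = np.cumsum(pca.explained_variance_ratio_)
    # first index where the cumulative variance reaches var, converted to a count
    return int(np.searchsorted(cum, var) + 1)

# toy check on random data (200 samples, 30 features)
rng = np.random.RandomState(0)
X_demo = rng.randn(200, 30)
k = n_components_for_variance(scale(X_demo), 0.95)
```

This avoids repeated decompositions while giving the same component count as the loop-based version.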
# To save scores of different models in a proper format
cv_scores=pd.DataFrame(index=['mean','std'])
scores = pd.DataFrame(index=['train','test','CV'])
def save_scores(name,cv,test,train):
global cv_scores
global scores
cv_scores.loc['mean',name] = cv[0]
cv_scores.loc['std',name] = cv[1]
scores.loc['train',name] = train
scores.loc['test',name] = test
scores.loc['CV',name] = cv[0]
return
import sklearn
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, MinMaxScaler, RobustScaler, scale
# Find number of features required for capturing 95% variance
p95, _= find_pca(scale(X),verbosity=1)
print('Features required: ',p95, '\ni.e. Percentage of features: ',round(p95*100/X.shape[1],2),'%')
Features required:  98
i.e. Percentage of features:  76.56 %
# Find number of features required for capturing 99% variance
p99, _= find_pca(scale(X),99,verbosity=1)
print('Features required: ',p99, '\ni.e. Percentage of features: ',round(p99*100/X.shape[1],2),'%')
Features required:  115
i.e. Percentage of features:  89.84 %
# train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, stratify=y, random_state=14)
RS = 14
from sklearn.preprocessing import QuantileTransformer
Pca = PCA(n_components=103, random_state=RS, whiten=True) # 103 components, between the 95% (98) and 99% (115) variance counts found above
Smot = SMOTE(random_state=RS) # to handle imbalanced classes
Trans = QuantileTransformer(output_distribution='normal', random_state=RS) # Transformation to reduce outliers
MinMax = MinMaxScaler() # Scaling for pca or classifier
Scaler = RobustScaler() # Scaling for pca or classifier
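`QuantileTransformer(output_distribution='normal')` maps each feature through its empirical CDF, so even extreme outliers land in the tails of a standard normal instead of dominating the scale. A small demonstration (variable names are ours):

```python
import numpy as np
from sklearn.preprocessing import QuantileTransformer

rng = np.random.RandomState(14)
x = rng.exponential(size=(300, 1))   # skewed, positive data
x[0, 0] = 1e6                        # inject an extreme outlier

qt = QuantileTransformer(output_distribution='normal',
                         n_quantiles=300, random_state=14)
x_t = qt.fit_transform(x)
# the outlier is now merely the largest quantile of a roughly standard-normal
# feature, not six orders of magnitude away from the rest
```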
#Objective: To show the standard scores required and also save them in a dataframe
def give_scores(name,model,X_train, X_test, y_train, y_test):
    # cross-validate on the training split only (an earlier variant that pooled train and test was discarded)
    cvs = cross_val_score(model, X_train, y_train, scoring='f1', cv=5)
print('CV score: ', cvs.mean().round(4))
print('\nTrain Accuracy scores: ',round(accuracy_score(y_train, model.predict(X_train)),4))
print('\nTest Accuracy scores: ',round(accuracy_score(y_test, model.predict(X_test)),4))
print('\nClassification reports of train and test set, respectively '+name)
train_report = pd.DataFrame(classification_report(y_train, model.predict(X_train),output_dict=True)).T.round(3)
test_report = pd.DataFrame(classification_report(y_test, model.predict(X_test),output_dict=True)).T.round(3)
display_sbs(train_report,test_report)
    # plot_confusion_matrix was removed in scikit-learn 1.2; use the Display API instead
    metrics.ConfusionMatrixDisplay.from_estimator(model, X_test, y_test, cmap=plt.cm.Blues)
save_scores(name,
[cvs.mean().round(4), cvs.std().round(4)],
test_report.loc['1','f1-score'],
train_report.loc['1','f1-score'])
return
from sklearn.metrics import classification_report, confusion_matrix, ConfusionMatrixDisplay  # plot_confusion_matrix was removed in scikit-learn 1.2
from sklearn.pipeline import make_pipeline
from imblearn.pipeline import Pipeline  # imblearn's Pipeline is needed so that SMOTE is applied only to the training folds
# 1.1. SVM Classifier With PCA
svc = SVC(C = 40,gamma = 0.0001, kernel='rbf',random_state=RS)
SVM_pipe1 = Pipeline([('trans',Trans),('scaler',Scaler),('pca',Pca),('smt', Smot), ('svc', svc)])
SVM_pipe1.fit(X_train,y_train)
give_scores('svc-pca',SVM_pipe1,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.1942
Train Accuracy scores:  0.8077
Test Accuracy scores:  0.7449
Classification reports of train and test set, respectively svc-pca
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.989 | 0.803 | 0.886 | 1097.000 |
| 1 | 0.239 | 0.872 | 0.376 | 78.000 |
| accuracy | 0.808 | 0.808 | 0.808 | 0.808 |
| macro avg | 0.614 | 0.837 | 0.631 | 1175.000 |
| weighted avg | 0.939 | 0.808 | 0.852 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.962 | 0.757 | 0.847 | 366.000 |
| 1 | 0.144 | 0.577 | 0.231 | 26.000 |
| accuracy | 0.745 | 0.745 | 0.745 | 0.745 |
| macro avg | 0.553 | 0.667 | 0.539 | 392.000 |
| weighted avg | 0.908 | 0.745 | 0.806 | 392.000 |
# 1.2. SVM Classifier Without PCA
svc = SVC(C = 40,gamma = 0.0061, kernel='rbf',random_state=RS)
SVM_pipe2 = Pipeline([('trans',Trans),('minmax',MinMax),('smt', Smot), ('svc', svc)])
SVM_pipe2.fit(X_train,y_train)
give_scores('svc',SVM_pipe2,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.2112
Train Accuracy scores:  0.8587
Test Accuracy scores:  0.7628
Classification reports of train and test set, respectively svc
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.985 | 0.861 | 0.919 | 1097.000 |
| 1 | 0.296 | 0.821 | 0.435 | 78.000 |
| accuracy | 0.859 | 0.859 | 0.859 | 0.859 |
| macro avg | 0.641 | 0.841 | 0.677 | 1175.000 |
| weighted avg | 0.940 | 0.859 | 0.887 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.953 | 0.784 | 0.861 | 366.000 |
| 1 | 0.132 | 0.462 | 0.205 | 26.000 |
| accuracy | 0.763 | 0.763 | 0.763 | 0.763 |
| macro avg | 0.543 | 0.623 | 0.533 | 392.000 |
| weighted avg | 0.899 | 0.763 | 0.817 | 392.000 |
##### 2.1. xgboost Classifier With PCA
# note: early_stopping_rounds only takes effect when an eval_set is passed to fit()
xgb_model = XGBClassifier(min_child_weight=2, max_depth=10, learning_rate=0.03, gamma=3,
                          early_stopping_rounds=20, eval_metric='auc', verbosity=0, random_state=RS, n_jobs=-1)
##### since xgboost is robust to outliers, transformation is not required
xgb_pipe = Pipeline([('scaler',Scaler),('pca',Pca),('smt', Smot),('xgb', xgb_model)])
xgb_pipe.fit(X_train,y_train)
give_scores('xgb-pca',xgb_pipe,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.019
Train Accuracy scores:  0.9991
Test Accuracy scores:  0.9056
Classification reports of train and test set, respectively xgb-pca
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 1.000 | 0.999 | 1.000 | 1097.000 |
| 1 | 0.987 | 1.000 | 0.994 | 78.000 |
| accuracy | 0.999 | 0.999 | 0.999 | 0.999 |
| macro avg | 0.994 | 1.000 | 0.997 | 1175.000 |
| weighted avg | 0.999 | 0.999 | 0.999 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.936 | 0.964 | 0.950 | 366.000 |
| 1 | 0.133 | 0.077 | 0.098 | 26.000 |
| accuracy | 0.906 | 0.906 | 0.906 | 0.906 |
| macro avg | 0.535 | 0.521 | 0.524 | 392.000 |
| weighted avg | 0.883 | 0.906 | 0.894 | 392.000 |
##### 2.2. xgboost Classifier Without pca
# note: early_stopping_rounds only takes effect when an eval_set is passed to fit()
xgb = XGBClassifier(min_child_weight=2, max_depth=6, learning_rate=0.05, gamma=15,
                    early_stopping_rounds=20, eval_metric='auc', verbosity=0, random_state=RS, n_jobs=-1)
xgb_pipe2 = Pipeline([('smt', Smot),('xgb', xgb)])
xgb_pipe2.fit(X_train,y_train)
give_scores('xgb',xgb_pipe2,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.1799
Train Accuracy scores:  0.966
Test Accuracy scores:  0.8776
Classification reports of train and test set, respectively xgb
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.984 | 0.979 | 0.982 | 1097.000 |
| 1 | 0.726 | 0.782 | 0.753 | 78.000 |
| accuracy | 0.966 | 0.966 | 0.966 | 0.966 |
| macro avg | 0.855 | 0.881 | 0.867 | 1175.000 |
| weighted avg | 0.967 | 0.966 | 0.967 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.939 | 0.929 | 0.934 | 366.000 |
| 1 | 0.133 | 0.154 | 0.143 | 26.000 |
| accuracy | 0.878 | 0.878 | 0.878 | 0.878 |
| macro avg | 0.536 | 0.541 | 0.538 | 392.000 |
| weighted avg | 0.886 | 0.878 | 0.882 | 392.000 |
from sklearn.linear_model import LogisticRegressionCV
# 3.1 Logistic regression with pca
# with class_weight=balanced, smote doesn't have much impact, thus removed
LR_cv = LogisticRegressionCV( scoring = 'f1', random_state=RS, class_weight='balanced',
verbose=0, n_jobs=-1, max_iter=10000)
LR_model1 = Pipeline([('trans',Trans),('scaler',Scaler),('pca',Pca),('LR', LR_cv)])
LR_model1.fit(X_train,y_train)
give_scores('LR-pca',LR_model1,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.2054
Train Accuracy scores:  0.7991
Test Accuracy scores:  0.7143
Classification reports of train and test set, respectively LR-pca
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.980 | 0.801 | 0.882 | 1097.000 |
| 1 | 0.216 | 0.769 | 0.337 | 78.000 |
| accuracy | 0.799 | 0.799 | 0.799 | 0.799 |
| macro avg | 0.598 | 0.785 | 0.609 | 1175.000 |
| weighted avg | 0.929 | 0.799 | 0.845 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.954 | 0.730 | 0.827 | 366.000 |
| 1 | 0.116 | 0.500 | 0.188 | 26.000 |
| accuracy | 0.714 | 0.714 | 0.714 | 0.714 |
| macro avg | 0.535 | 0.615 | 0.508 | 392.000 |
| weighted avg | 0.898 | 0.714 | 0.784 | 392.000 |
# 3.2 Logistic regression without pca and scaling
LR_cv = LogisticRegressionCV( scoring = 'f1', random_state=RS, class_weight='balanced',
verbose=0, n_jobs=-1, max_iter=10000)
LR_model2 = Pipeline([('trans',Trans),('LR', LR_cv)])
LR_model2.fit(X_train,y_train)
give_scores('LR',LR_model2,X_train, X_test, y_train, y_test) # show the scores and save them for plotting
CV score:  0.2049
Train Accuracy scores:  0.8077
Test Accuracy scores:  0.7219
Classification reports of train and test set, respectively LR
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.989 | 0.803 | 0.886 | 1097.000 |
| 1 | 0.239 | 0.872 | 0.376 | 78.000 |
| accuracy | 0.808 | 0.808 | 0.808 | 0.808 |
| macro avg | 0.614 | 0.837 | 0.631 | 1175.000 |
| weighted avg | 0.939 | 0.808 | 0.852 | 1175.000 |
| precision | recall | f1-score | support | |
|---|---|---|---|---|
| 0 | 0.957 | 0.735 | 0.832 | 366.000 |
| 1 | 0.126 | 0.538 | 0.204 | 26.000 |
| accuracy | 0.722 | 0.722 | 0.722 | 0.722 |
| macro avg | 0.542 | 0.637 | 0.518 | 392.000 |
| weighted avg | 0.902 | 0.722 | 0.790 | 392.000 |
Overall performance
Best F1 score: xgb without PCA
Best CV score: SVM without PCA
Best precision: xgb with PCA
Best recall: logistic regression with PCA
# Objective: To plot the scores from different models saved in the 'scores' dataframe
def plot_scores(df=None):
global scores
    if df is None: df = scores
fig, ax = plt.subplots(figsize=(10,15))
colors = ['orange','green', 'blue', 'red', 'yellow']
rectangles=[]
N =len(df.columns)
ind = np.arange(N)
xlabels = df.columns
width = 0.2 # the width of the bars
ax.set_yticks(ind + width)
ax.set_yticklabels(xlabels,fontsize=10)
ax.set_ylabel("Models", fontsize=12)
ax.set_xlabel("scores", fontsize=12)
ax.set_title('scores with different Models')
def labelvalues(rects):
for rect in rects:
height = rect.get_width()*100
ax.text(height/100, rect.get_y() + rect.get_height()/2., '{0:1.2f}'.format(height),va='center', ha='left')
for i in range(df.shape[0]):
rectangles.append(ax.barh(ind+width*i, df.iloc[i,:], width, color=colors[i]))
labelvalues(rectangles[i])
rect_leg = [item[0] for item in rectangles]
rect_leg.reverse()
scor = df.index.tolist()
scor.reverse()
ax.legend((rect_leg),(scor),bbox_to_anchor=(1.13, 1.01))
plt.show()
global cv_scores
fig = go.Figure(data=go.Scatter(
x=cv_scores.columns.tolist(),y=cv_scores.loc['mean'],
error_y=dict(type='data', array=cv_scores.loc['std'], visible=True)))
fig.update_layout(title='CV scores with standard deviation for different models',
yaxis_zeroline=False, xaxis_zeroline=False)
fig.show()
return
plot_scores()
import pickle
# persist the chosen model for reuse (here the over-sampled Random Forest judged best above)
filename = 'finalized_model.sav'
pickle.dump(rf_grid1, open(filename, 'wb'))
Thus, apart from xgb-pca, all the algorithms have broadly similar mean CV scores.
fdata.shape
(18, 591)
a=list(pdata3.columns)
df1 = fdata[[0,1,2,3,4,6,12,14,15,16,18,21,23,24,27,28,29,31,32,33,38,40,41,43,45,48,51,55,59,62,63,64,67,68,71,83,88,90,98,115,
117,122,129,133,134,135,136,137,138,139,142,150,151,155,159,160,161,162,166,167,180,182,183,185,188,195,200,201,208,218,
223,225,250,268,269,416,417,418,419,423,426,429,432,433,438,439,442,453,460,468,472,476,482,483,484,485,486,487,
488,489,491,493,494,496,499,500,510,511,520,521,523,525,526,527,539,545,546,547,548,550,561,562,564,569,570,572,585,
589]]
df1
(df1 output: 18 rows × 129 selected feature columns; display truncated)
| 1 | 3095.78 | 2465.14 | 2230.4222 | 1463.6606 | 0.8294 | 102.3433 | 200.5470 | 10.1548 | 414.7347 | 9.2599 | 191.2872 | -5441.50 | -3498.75 | -1640.25 | 7.3900 | 68.4222 | 2.2667 | 3.4171 | 84.9052 | 9.7997 | 87.5241 | 78.25 | 2.773 | 352.2445 | 133.1727 | 145.8445 | 205.1695 | 2853 | 0.8073 | 113.9800 | 10.9036 | 19.1927 | 1.1598 | 154.3709 | 82.3494 | 6.8043 | 1931.6464 | 8407.0299 | -0.9353 | 731.2517 | 58.6680 | 2.541 | -0.0946 | 998.1081 | 37.9213 | 98 | 80.3 | 81.0 | 56.2000 | 219.7679 | 5.70 | 6.285 | 13.077 | 0.35 | 568 | 59 | 297 | 3277 | 2.2 | 1.1 | 17.99 | 10.14 | 16.358 | 6.92 | 82.986 | 0.222 | 8.02 | 3.74 | 134.250 | 3.9578 | 128.4285 | 1988.0000 | 193.0287 | 29.743 | 3.6327 | 3.9300 | 9.0604 | 0.0000 | 368.9713 | 61.8918 | 1.4857 | 3.1595 | 8.4887 | 199.7866 | 48.5294 | 37.5793 | 1.9562 | 4.3737 | 40.4475 | 463.2883 | 73.5536 | 30.8643 | 0.0000 | 246.7762 | 0.0000 | 359.0444 | 130.6350 | 820.7900 | 194.4371 | 0.0000 | 3.6822 | 3.2029 | 0.1441 | 12.6788 | 0.0000 | 0.0000 | 141.4365 | 0.0000 | 1.6292 | 0 | 0.0673 | 3.1310 | 0.8832 | 8.8370 | 1.7910 | 7.116 | 1.3526 | 408.798 | 74.640 | 16.00 | 18.1087 | NaN | NaN | NaN | 535.0164 | 5.9200 | 4.4447 | 208.2045 |
| 2 | 2932.61 | 2559.94 | 2186.4111 | 1698.0172 | 1.5102 | 95.4878 | 202.0179 | 9.5157 | 416.7075 | 9.3144 | 192.7035 | -5447.75 | -4047.00 | -1916.50 | 7.5788 | 67.1333 | 2.3333 | 3.5986 | 84.7569 | 8.6590 | 84.7327 | 14.37 | 5.434 | 364.3782 | 131.8027 | 141.0845 | 185.7574 | 2936 | 23.8245 | 115.6273 | 11.3019 | 16.1755 | 0.8694 | 145.8000 | 84.7681 | 7.1041 | 1685.8514 | 9317.1698 | -0.1427 | 718.5777 | 58.4808 | 2.882 | -0.1892 | 998.4440 | 42.0579 | 89 | 126.4 | 96.5 | 45.1001 | 306.0380 | 8.33 | 4.819 | 8.443 | 0.47 | 562 | 788 | 759 | 2100 | 2.1 | 1.4 | 17.78 | 13.31 | 22.912 | 9.21 | 60.110 | 0.139 | 16.73 | 5.09 | 79.618 | 2.4266 | 182.4956 | 839.6006 | 104.4042 | 29.621 | 3.9133 | 3.0609 | 5.2231 | 0.0000 | 0.0000 | 50.6425 | 1.8268 | 3.5220 | 18.7546 | 109.5747 | 60.0000 | 70.9161 | 0.4264 | 7.5418 | 32.3594 | 21.3645 | 148.0287 | 13.3923 | 434.2674 | 151.7665 | 0.0000 | 190.3869 | 746.9150 | 74.0741 | 191.7582 | 250.1742 | 1.0281 | 3.9238 | 1.5357 | 18.9849 | 0.0000 | 0.0000 | 240.7767 | 244.2748 | 2.9626 | 0 | 0.0751 | 12.1831 | 0.6451 | 6.4568 | 2.1538 | 7.116 | 0.7942 | 411.136 | 74.654 | 16.16 | 24.7524 | 267.064 | 1.10 | 68.8489 | 535.0245 | 11.2100 | 3.1745 | 82.8602 |
| 3 | 2988.72 | 2479.90 | 2199.0333 | 909.7926 | 1.3204 | 104.2367 | 201.8482 | 9.6052 | 422.2894 | 9.6924 | 192.1557 | -5468.25 | -4515.00 | -1657.25 | 7.3145 | 62.9333 | 2.6444 | 3.3813 | 84.9105 | 8.6789 | 86.6867 | 76.90 | 1.279 | 363.0273 | 131.8027 | 142.5427 | 189.9079 | 2936 | 24.3791 | 116.1818 | 13.5597 | 15.6209 | 0.9761 | 147.6545 | 70.2289 | 7.5925 | 1752.0968 | 8205.7000 | 0.0177 | 709.0867 | 58.6635 | 3.132 | 0.2838 | 980.4510 | 41.1025 | 127 | 118.0 | 123.7 | 47.8000 | 162.4320 | 5.51 | 9.073 | 15.241 | 0.35 | 859 | 355 | 3433 | 3004 | 1.7 | 0.9 | 16.22 | 14.67 | 22.562 | 5.69 | 52.571 | 0.139 | 13.56 | 5.92 | 104.950 | 5.5398 | 152.0885 | 820.3999 | 94.0954 | 31.830 | 3.1959 | 2.4643 | 7.6602 | 317.7362 | 0.0000 | 94.4594 | 1.5441 | 4.9898 | 76.0354 | 181.2641 | 34.0336 | 41.5236 | 0.4097 | 6.9785 | 27.6824 | 24.2831 | 100.0021 | 35.4323 | 225.0169 | 100.4883 | 305.7500 | 88.5553 | 104.6660 | 71.7583 | 0.0000 | 336.7660 | 1.7670 | 3.1817 | 0.1488 | 29.2542 | 0.0000 | 711.6418 | 113.5593 | 0.0000 | 2.4416 | 0 | 0.0977 | 6.7553 | 0.7404 | 6.4865 | 2.1565 | 7.116 | 1.1650 | 372.822 | 72.442 | 131.68 | 62.7572 | 268.228 | 7.32 | 25.0363 | 530.5682 | 9.3300 | 2.0544 | 73.8432 |
| 4 | 3032.24 | 2502.87 | 2233.3667 | 1326.5200 | 1.5334 | 100.3967 | 201.9424 | 10.5661 | 420.5925 | 10.3387 | 191.6037 | -5476.25 | -3987.50 | 117.00 | 7.2748 | 62.8333 | 3.1556 | 3.2728 | 86.3269 | 8.7677 | 86.1468 | 76.39 | 2.209 | 353.3400 | 176.3136 | 138.0882 | 233.5491 | 2865 | -12.2945 | 144.0191 | 21.9782 | 32.2945 | 0.9256 | 146.6636 | 65.8417 | 7.5017 | 1828.3846 | 9014.4600 | -0.6704 | 796.5950 | 58.3858 | 3.148 | -0.5677 | 993.1274 | 38.1448 | 119 | 143.2 | 123.1 | 48.8000 | 296.3030 | 3.64 | 9.005 | 12.506 | 0.43 | 699 | 283 | 1747 | 1443 | 3.9 | 0.8 | 15.24 | 10.85 | 37.715 | 3.98 | 72.149 | 0.250 | 19.77 | 5.52 | 92.307 | 4.1338 | 69.1510 | 1406.4004 | 149.2172 | 19.862 | 3.6163 | 3.3208 | 4.2178 | 0.0000 | 866.0295 | 85.2255 | 1.2943 | 3.8754 | 43.8119 | 0.0000 | 25.3521 | 37.4691 | 0.7198 | 2.7092 | 30.8924 | 44.8980 | 89.9529 | 42.6838 | 171.4486 | 276.8810 | 461.8619 | 240.1781 | 0.0000 | 587.3773 | 748.1781 | 0.0000 | 2.2358 | 3.2712 | 0.0372 | 107.6905 | 293.1396 | 0.0000 | 148.0663 | 0.0000 | 2.5512 | 0 | 0.0616 | 2.9954 | 2.2181 | 6.3745 | 2.0579 | 7.116 | 1.4636 | 399.914 | 79.156 | 19.63 | 22.0500 | NaN | NaN | NaN | 532.0155 | 8.8300 | 99.3032 | 73.8432 |
| 5 | 2946.25 | 2432.84 | 2233.3667 | 1326.5200 | 1.5334 | 100.3967 | 200.4720 | 8.6617 | 414.2426 | 9.2441 | 191.2280 | -6058.00 | -3906.50 | 193.75 | 3.0505 | 62.3778 | 1.6333 | 3.5200 | 85.4233 | 9.6484 | 87.0273 | 59.94 | 3.024 | 360.2873 | 142.2591 | 137.6473 | 192.7702 | 2865 | 28.2955 | 130.5545 | 7.6180 | 11.7045 | 0.9227 | 144.6982 | 70.3707 | 6.7509 | 1832.6305 | 7869.7000 | -0.2240 | 748.0887 | 58.6560 | 2.541 | -0.0946 | 998.1081 | 37.9213 | 123 | 159.8 | 119.3 | 48.8000 | 296.3030 | 3.64 | 5.369 | 10.420 | 0.45 | 2089 | 910 | 323 | 1157 | 1.7 | 0.9 | 19.70 | 14.38 | 35.535 | 7.92 | 82.556 | 0.244 | 21.78 | 6.61 | 111.952 | 2.8799 | 129.1731 | 1132.2998 | 158.2729 | 29.743 | 3.6327 | 3.3208 | 4.6380 | 347.7740 | 865.1584 | 61.9853 | 1.0841 | 3.6904 | 8.2683 | 597.1613 | 55.1020 | 76.0159 | 3.2918 | 4.9282 | 42.8261 | 23.3606 | 285.9000 | 56.5390 | 325.3468 | 71.0586 | 0.0000 | 826.9360 | 0.0000 | 78.3348 | 723.4234 | 273.8095 | 8.2199 | 2.7883 | 0.2873 | 14.7656 | 0.0000 | 0.0000 | 68.7351 | 622.2222 | 2.1027 | 0 | 0.0706 | 4.4060 | 2.9077 | 7.0970 | 1.7910 | 7.116 | 1.2708 | 412.222 | 80.326 | 16.06 | 30.6277 | 254.006 | 4.75 | 22.5598 | 534.2091 | 8.9100 | 3.8276 | 44.0077 |
| 6 | 3030.27 | 2430.12 | 2230.4222 | 1463.6606 | 0.8294 | 102.3433 | 202.0901 | 9.0350 | 415.8852 | 9.9990 | 192.0912 | -6154.00 | -3914.75 | 580.25 | 3.0515 | 60.8000 | 2.4889 | 3.4927 | 87.1543 | 9.1502 | 86.3740 | 74.46 | 3.978 | 352.1836 | 132.3455 | 146.4855 | 201.7609 | 2853 | 2.1109 | 114.4564 | 12.7400 | 17.8891 | 1.1150 | 152.3582 | 83.4997 | 8.4126 | 1740.6472 | 9238.3601 | -0.1940 | 806.2734 | 58.8230 | 2.575 | -0.4731 | 995.1991 | 36.9578 | 89 | 173.7 | 122.5 | 56.2000 | 219.7679 | 5.70 | 8.255 | 10.965 | 0.35 | 2117 | 643 | 441 | 565 | 3.0 | 0.6 | 18.27 | 10.81 | 15.840 | 5.04 | 47.803 | 0.222 | 14.05 | 3.43 | 114.350 | 1.9062 | 65.3582 | 1046.2002 | 105.5219 | 29.099 | 2.8000 | 3.9300 | 6.0381 | 326.1524 | 467.5756 | 91.3669 | 1.9544 | 6.1127 | 11.2651 | 97.3718 | 24.1071 | 55.7395 | 0.3887 | 4.7207 | 23.6929 | 162.4892 | 110.2830 | 29.9535 | 266.1526 | 509.0909 | 110.6781 | 189.4602 | 586.3636 | 0.0000 | 355.7932 | 304.5139 | 2.6373 | 3.6587 | 0.3268 | 41.0596 | 0.0000 | 723.3853 | 159.5238 | 748.3871 | 4.0505 | 0 | 0.0988 | 4.7340 | 0.5157 | 8.4653 | 1.4336 | 7.116 | 0.5051 | 404.356 | 72.040 | 13.73 | 51.4535 | NaN | NaN | NaN | 541.9036 | 6.4800 | 2.8515 | 44.0077 |
| 7 | 3058.88 | 2690.15 | 2248.9000 | 1004.4692 | 0.7884 | 106.2400 | 202.4170 | 13.6872 | 408.4017 | 9.6836 | 192.7334 | -5395.50 | -3819.75 | -737.00 | 7.3885 | 67.3000 | 2.7889 | 3.4588 | 83.8887 | 8.7289 | 86.7493 | 78.09 | 2.671 | 358.6700 | 142.2945 | 136.6791 | 133.6606 | 2865 | 29.1309 | 131.4255 | 5.5936 | 10.8691 | 0.9965 | 144.3255 | 57.0841 | 6.2166 | 1826.1111 | 9117.3699 | -0.1188 | 778.3335 | 59.0421 | 2.270 | 0.7096 | 992.8298 | 40.5458 | 91 | 175.4 | 201.1 | 55.5000 | 249.9639 | 7.56 | 19.388 | 14.952 | 0.48 | 233 | 346 | 1429 | 807 | 1.9 | 1.1 | 13.92 | 14.09 | 37.242 | 7.61 | 11.961 | 0.244 | 12.63 | 4.72 | 73.632 | 7.5616 | 126.1031 | 1222.1006 | 128.7632 | 27.861 | 2.9057 | 2.8699 | 6.7774 | 399.3620 | 0.0000 | 141.6508 | 1.6483 | 3.3651 | 37.4108 | 109.4980 | 39.4422 | 32.6055 | 1.1227 | 3.9589 | 8.9488 | 16.2027 | 225.7941 | 71.7657 | 374.6159 | 89.1323 | 484.1046 | 0.0000 | 0.0000 | 129.2388 | 0.0000 | 440.8163 | 2.5556 | 3.8989 | 0.0527 | 47.8698 | 0.0000 | 0.0000 | 79.1209 | 0.0000 | 1.6426 | 0 | 0.1232 | 4.2075 | 2.6891 | 6.9889 | 1.4141 | 7.116 | 0.6174 | 408.116 | 73.856 | 15.97 | 43.0771 | 265.090 | 7.78 | 23.6052 | 493.0054 | 278.1900 | 2.1261 | 95.0310 |
| 8 | 2967.68 | 2600.47 | 2248.9000 | 1004.4692 | 0.7884 | 106.2400 | 202.4544 | 12.6837 | 417.6009 | 9.7046 | 192.7498 | -4196.50 | -2948.00 | 622.75 | 2.2675 | 62.8000 | 2.1444 | 3.5067 | 85.4274 | 9.6234 | 86.9476 | 61.10 | 3.217 | 358.9564 | 142.7009 | 138.3109 | 216.9538 | 2865 | 26.9791 | 129.6800 | 8.8612 | 13.0209 | 0.9079 | 144.5455 | 74.3992 | 7.6346 | 1788.8624 | 8615.2700 | 0.1975 | 748.5630 | 58.7785 | 2.270 | 0.7096 | 992.8298 | 40.5458 | 91 | 136.1 | 116.8 | 55.5000 | 249.9639 | 7.56 | 7.245 | 16.973 | 0.48 | 5781 | 2900 | 3961 | 948 | 2.4 | 0.6 | 12.48 | 11.45 | 31.969 | 5.63 | 81.637 | 0.244 | 16.32 | 5.09 | 90.715 | 2.8877 | 100.5114 | 740.3994 | 61.5139 | 27.861 | 2.9057 | 2.8699 | 6.1530 | 496.3964 | 624.0602 | 57.1204 | 1.5571 | 3.9248 | 134.3623 | 152.2280 | 27.9793 | 38.4940 | 2.9979 | 3.6794 | 37.6287 | 18.8665 | 184.1738 | 44.9089 | 477.0328 | 106.1693 | 51.4694 | 892.1933 | 616.8421 | 86.1931 | 364.0719 | 155.6827 | 2.5556 | 1.7466 | 2.9308 | 25.8449 | 0.0000 | 686.5784 | 138.5827 | 395.2941 | 2.3441 | 0 | 0.0727 | 6.5231 | 2.0667 | 6.7934 | 1.4141 | 7.116 | 0.6710 | 408.468 | 74.016 | 17.54 | 13.9158 | 265.184 | 6.28 | 18.2120 | 535.1818 | 7.0900 | 3.4456 | 111.6525 |
| 9 | 3016.11 | 2428.37 | 2248.9000 | 1004.4692 | 0.7884 | 106.2400 | 202.5999 | 12.4278 | 413.3677 | 9.7046 | 192.8953 | -5357.50 | -3911.50 | 873.75 | 3.0995 | 67.0333 | 2.3778 | 3.5207 | 83.8887 | 8.7289 | 86.7493 | 78.09 | 2.671 | 360.6945 | 141.9964 | 137.1391 | 194.3669 | 2865 | 28.3118 | 130.3082 | 6.8983 | 11.6882 | 1.0028 | 147.1409 | 65.5040 | 7.4956 | 1712.8033 | 9126.7200 | 0.3639 | 786.6071 | 58.3664 | 2.270 | 0.7096 | 992.8298 | 40.5458 | 123 | 175.8 | 211.9 | 55.5000 | 249.9639 | 7.56 | 13.545 | 18.186 | 0.48 | 862 | 882 | 422 | 186 | 2.5 | 0.8 | 14.90 | 14.36 | 36.258 | 10.29 | 43.062 | 0.244 | 5.58 | 6.18 | 109.867 | 4.2936 | 138.5192 | 1266.0000 | 123.1775 | 27.861 | 2.9057 | 2.8699 | 4.7137 | 397.4922 | 979.2574 | 108.9894 | 1.8549 | 3.0589 | 10.7887 | 21.2876 | 33.6449 | 45.7453 | 1.1227 | 5.2213 | 22.1550 | 21.8283 | 80.8892 | 63.2981 | 790.5782 | 104.0040 | 979.8817 | 816.9231 | 497.6030 | 119.3208 | 115.0100 | 0.0000 | 3.2587 | 3.5500 | 0.6710 | 44.0988 | 0.0000 | 0.0000 | 86.0068 | 0.0000 | 2.6612 | 0 | 0.1007 | 7.4717 | 0.4676 | 7.4985 | 1.4141 | 7.116 | 0.9069 | 396.504 | 75.060 | 15.15 | 20.9776 | 265.206 | 7.04 | 5.8617 | 533.4200 | 3.5400 | 3.0687 | 90.2294 |
| 10 | 2994.05 | 2548.21 | 2195.1222 | 1046.1468 | 1.3204 | 103.3400 | 201.7125 | 11.8566 | 411.9572 | 10.2918 | 191.4207 | -2727.50 | -1976.00 | 360.50 | 1.5320 | 68.7111 | 1.9444 | 3.5029 | 84.7569 | 8.6590 | 84.7327 | 14.37 | 5.434 | 366.1545 | 120.2818 | 138.1345 | 113.0565 | 2891 | 27.8373 | 108.1191 | 7.8579 | 12.1627 | 0.8692 | 144.5955 | 112.0789 | 7.4902 | 1732.2002 | 8476.1700 | -0.1401 | 636.9547 | 58.5394 | 3.148 | -0.5677 | 993.1274 | 38.1448 | 89 | 170.8 | 146.2 | 53.0999 | 380.5909 | 5.82 | 9.660 | 10.227 | 0.46 | 5655 | 2942 | 3967 | 776 | 2.2 | 1.0 | 16.15 | 13.20 | 23.319 | 3.04 | 34.253 | 4.617 | 17.24 | 4.07 | 118.057 | 2.3422 | 68.0260 | 1104.5000 | 138.0032 | 19.862 | 3.6163 | 3.0263 | 8.7623 | 269.3851 | 0.0000 | 81.4734 | 1.8942 | 4.3945 | 200.7591 | 215.2566 | 51.4286 | 70.0671 | 0.4264 | 3.1283 | 30.2973 | 14.6207 | 219.3969 | 24.5336 | 0.0000 | 99.4809 | 0.0000 | 162.1884 | 0.0000 | 97.2892 | 0.0000 | 293.8947 | 2.5832 | 3.9238 | 0.8832 | 42.7355 | 474.3363 | 615.0150 | 87.1111 | 0.0000 | 4.7811 | 0 | 0.1140 | 4.8734 | 0.3783 | 6.1550 | 2.0579 | 7.116 | 0.7517 | 411.648 | 74.512 | 17.93 | 53.1570 | 265.922 | 9.26 | 15.4411 | 532.1764 | 9.5699 | 3.2115 | 57.8122 |
| 11 | 2928.84 | 2479.40 | 2196.2111 | 1605.7578 | 0.9959 | 97.9156 | 202.1264 | 9.1084 | 419.9018 | 10.1130 | 192.0134 | -6097.75 | -3869.75 | 752.50 | 3.0345 | 64.7667 | 2.0667 | 3.5217 | 85.4274 | 9.6234 | 86.9476 | 61.10 | 3.217 | 360.7891 | 143.8782 | 136.8118 | 213.6180 | 2865 | 25.0836 | 128.9618 | 9.4424 | 14.9164 | 0.9241 | 144.1700 | 63.7271 | 6.6383 | 1632.3120 | 8789.5001 | 0.0000 | 725.8145 | 59.5776 | 2.661 | -0.5677 | 997.4308 | 45.1318 | 123 | 112.8 | 150.5 | 51.7000 | 250.3359 | 5.54 | 4.678 | 11.597 | 0.48 | 2352 | 597 | 295 | 143 | 2.3 | 0.9 | 15.08 | 13.48 | 36.218 | 7.17 | 72.888 | 0.244 | 13.78 | 2.08 | 73.696 | 3.9528 | 104.4568 | 740.5996 | 53.6078 | 27.537 | 4.1301 | 3.8989 | 7.4677 | 569.2818 | 236.8442 | 51.3594 | 1.7250 | 2.6594 | 7.6232 | 19.0033 | 43.5484 | 34.9272 | 2.9979 | 4.1735 | 34.1207 | 8.2922 | 145.9369 | 51.7796 | 613.8337 | 73.8120 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 141.3793 | 258.5859 | 0.7578 | 3.1045 | 0.4323 | 26.8017 | 935.7644 | 0.0000 | 67.9406 | 0.0000 | 4.6512 | 0 | 0.0625 | 5.2886 | 1.8407 | 7.2606 | 1.5371 | 7.116 | 1.2990 | 399.092 | 75.820 | 14.32 | 42.3877 | 264.188 | 3.32 | 13.0129 | 533.7464 | 7.7400 | 8.5646 | 75.5077 |
| 12 | 2920.07 | 2507.40 | 2195.1222 | 1046.1468 | 1.3204 | 103.3400 | 202.1269 | 8.4828 | 415.5185 | 9.5007 | 192.6261 | -5521.25 | -3982.25 | -1933.25 | 7.4027 | 63.3778 | 2.8222 | 3.5298 | 84.9105 | 8.6789 | 86.6867 | 76.90 | 1.279 | 363.0364 | 137.3564 | 138.1709 | 228.0825 | 2936 | 22.6227 | 119.9791 | 11.0227 | 17.3773 | 0.9556 | 142.5155 | 80.1463 | 7.5539 | 1891.4401 | 7889.7200 | 0.1172 | 670.3646 | 58.7200 | 2.575 | -0.4731 | 995.1991 | 36.9578 | 127 | 64.1 | 71.9 | 53.0999 | 380.5909 | 5.82 | 7.549 | 9.642 | 0.40 | 971 | 302 | 1437 | 3851 | 1.7 | 0.9 | 16.06 | 14.72 | 25.722 | 8.33 | 70.642 | 0.139 | 17.96 | 7.11 | 114.368 | 3.3281 | 71.3541 | 1541.7002 | 159.7564 | 29.099 | 2.8000 | 3.0263 | 8.7131 | 288.9425 | 259.5720 | 88.9917 | 1.9245 | 1.5100 | 36.0851 | 199.1982 | 31.8898 | 49.9301 | 0.4097 | 6.4792 | 30.9721 | 31.4286 | 162.9366 | 24.1399 | 406.6335 | 46.8246 | 57.6480 | 262.5493 | 0.0000 | 76.2956 | 672.8421 | 455.7599 | 4.0806 | 3.8574 | 0.6807 | 56.6776 | 0.0000 | 0.0000 | 94.8247 | 0.0000 | 1.4222 | 0 | 0.0713 | 6.2801 | 0.5659 | 6.8260 | 1.4336 | 7.116 | 1.8373 | 408.824 | 81.582 | 17.48 | 62.7572 | NaN | NaN | NaN | 530.1800 | 9.9500 | 3.0926 | 52.2039 |
| 13 | 3051.44 | 2529.27 | 2184.4333 | 877.6266 | 1.4668 | 107.8711 | 226.0086 | 9.7686 | 409.8885 | 10.4109 | 215.5977 | -6069.75 | -3883.50 | 833.75 | 3.1158 | 67.9889 | 2.3000 | 3.5296 | 83.7998 | 8.7977 | 86.6693 | 78.11 | 3.872 | 350.1745 | 133.2682 | 137.9873 | 112.7524 | 2872 | -1.1927 | 112.0755 | 12.6400 | 21.1927 | 0.9458 | 145.2836 | 104.6736 | 7.8046 | 1706.2759 | 8889.3000 | 0.0387 | 756.9414 | 59.5615 | 3.038 | 0.0000 | 998.4607 | 38.6257 | 90 | 102.5 | 80.2 | 52.8000 | 203.0710 | 8.40 | 4.392 | 6.621 | 12.71 | 2413 | 648 | 692 | 216 | 3.5 | 1.0 | 21.30 | 12.06 | 20.512 | 5.02 | 38.776 | 0.231 | 12.63 | 3.57 | 36.619 | 5.1150 | 129.6200 | 846.5996 | 77.8939 | 38.321 | 4.2300 | 2.2581 | 8.6638 | 301.7214 | 0.0000 | 44.9602 | 3.5863 | 4.1112 | 17.8190 | 25.9070 | 43.4783 | 62.2874 | 1.0565 | 2.4834 | 34.3904 | 299.3140 | 99.9209 | 6.6225 | 791.0280 | 50.0994 | 521.8437 | 170.8835 | 221.0019 | 466.6667 | 139.8916 | 644.5040 | 0.7578 | 3.1045 | 0.1816 | 20.6583 | 719.4245 | 0.0000 | 55.0265 | 0.0000 | 1.1444 | 0 | 0.0882 | 3.9474 | 3.1311 | 6.2772 | 1.1299 | 7.116 | 0.8668 | 400.008 | 73.326 | 14.44 | 13.9158 | 258.406 | 4.52 | 23.8168 | 533.2464 | 9.9300 | 3.0063 | 52.2039 |
| 14 | 2963.97 | 2629.48 | 2224.6222 | 947.7739 | 1.2924 | 104.8489 | 195.3787 | 9.7561 | 422.3675 | 9.7825 | 185.5963 | -1431.50 | -1025.25 | 131.75 | 0.7672 | 62.5000 | 2.4222 | 3.4603 | 85.0922 | 9.2007 | 88.3312 | 74.57 | 1.946 | 354.2945 | 138.2800 | 137.0027 | 133.2749 | 2887 | 17.6382 | 115.9182 | 10.2273 | 22.3618 | 1.0144 | 150.1191 | 78.7486 | 7.3445 | 1839.8450 | 8651.3099 | -0.4609 | 736.1852 | 56.5635 | 3.038 | 0.0000 | 998.4607 | 38.6257 | 70 | 209.2 | 197.9 | 64.0999 | 295.5630 | 7.99 | 7.568 | 10.272 | 0.48 | 5726 | 2478 | 4101 | 527 | 2.2 | 1.0 | 18.54 | 7.73 | 22.958 | 9.59 | 32.266 | 0.355 | 17.34 | 3.91 | 66.817 | 3.3189 | 157.0715 | 959.7998 | 148.2980 | 38.321 | 4.2300 | 3.5078 | 5.0503 | 692.5477 | 385.4507 | 77.5720 | 1.4406 | 3.5155 | 400.0000 | 400.0000 | 41.2844 | 54.2108 | 1.6445 | 7.3162 | 24.2101 | 22.1678 | 169.5462 | 15.7680 | 901.9024 | 98.8254 | 0.0000 | 162.9900 | 152.3560 | 81.9411 | 398.1250 | 320.2966 | 1.8655 | 3.5795 | 5.1800 | 71.0696 | 344.4976 | 697.8723 | 125.5385 | 728.8889 | 12.2560 | 0 | 2.0882 | 5.1066 | 1.8764 | 4.8988 | 1.1299 | 7.116 | 1.0620 | 399.004 | 81.434 | 14.17 | 13.4909 | NaN | NaN | NaN | 532.6446 | 11.4200 | 1.8483 | 142.9080 |
| 15 | 2988.31 | 2546.26 | 2224.6222 | 947.7739 | 1.2924 | 104.8489 | 192.9787 | 12.4364 | 424.7536 | 9.5917 | 183.3869 | -2694.75 | -2053.00 | 81.25 | 1.4915 | 61.2333 | 2.2444 | 3.4814 | 85.0922 | 9.2007 | 88.3312 | 74.57 | 1.946 | 352.6455 | 141.3000 | 136.0836 | 141.8843 | 2875 | 20.4209 | 121.7209 | 5.9718 | 19.5791 | 0.9025 | 150.3127 | 112.0904 | 7.5348 | 1777.4358 | 8907.6800 | -0.7773 | 762.8943 | 57.1724 | 3.263 | -0.0473 | 1004.8054 | 39.4139 | 95 | 144.2 | 126.0 | 64.0999 | 295.5630 | 7.99 | 7.134 | 13.663 | 0.37 | 5803 | 3530 | 4483 | 265 | 2.9 | 0.9 | 14.57 | 7.34 | 24.174 | 5.36 | 30.085 | 0.252 | 10.40 | 3.10 | 63.971 | 2.7716 | 139.5020 | 1388.2002 | 143.3508 | 32.201 | 2.2894 | 3.5078 | 5.9490 | 436.0000 | 0.0000 | 57.3640 | 1.0833 | 2.7009 | 218.3634 | 326.1538 | 40.0990 | 60.9836 | 1.6445 | 3.7873 | 21.2039 | 15.1805 | 174.1510 | 21.0437 | 0.0000 | 201.3399 | 0.0000 | 111.6582 | 128.0311 | 41.2447 | 560.6957 | 203.1655 | 2.3552 | 3.5795 | 0.0922 | 97.5149 | 336.5385 | 0.0000 | 263.4538 | 0.0000 | 3.3623 | 0 | 0.2576 | 5.4568 | 2.2892 | 4.5062 | 1.5106 | 7.116 | 0.7514 | 401.488 | 73.854 | 17.42 | 12.4295 | NaN | NaN | NaN | 536.1118 | 8.5300 | 1.5352 | 100.2745 |
| 16 | 3028.02 | 2560.87 | 2270.2556 | 1258.4558 | 1.3950 | 104.8078 | 195.1742 | 12.1805 | 428.9826 | 9.1999 | 185.9743 | -5757.25 | -3654.50 | -2445.00 | 7.2435 | 63.3556 | 2.9667 | 3.4262 | 86.3472 | 9.1283 | 86.2537 | 73.95 | 3.860 | 362.1782 | 146.8118 | 139.6600 | 253.1952 | 2845 | -8.1182 | 118.6936 | 16.6636 | 28.1182 | 0.8885 | 150.2055 | 92.1660 | 6.4414 | 1717.1081 | 8581.6799 | -0.0954 | 729.7498 | 59.1715 | 3.263 | -0.0473 | 1004.8054 | 39.4139 | 85 | 189.6 | 102.3 | 48.0000 | 180.0040 | 4.94 | 6.331 | 16.249 | 0.49 | 601 | 199 | 643 | 1748 | 3.1 | 1.1 | 20.32 | 10.02 | 20.532 | 6.87 | 49.321 | 0.270 | 15.70 | 2.59 | 113.118 | 3.2411 | 181.5117 | 606.2998 | 55.5248 | 32.201 | 2.2894 | 2.5691 | 6.2150 | 440.5990 | 0.0000 | 51.9767 | 1.4197 | 2.1985 | 17.5947 | 71.4928 | 37.0787 | 48.6739 | 0.4215 | 3.7038 | 19.4794 | 31.9037 | 94.2171 | 20.9306 | 994.1094 | 228.6286 | 981.4324 | 0.0000 | 198.3728 | 278.5714 | 210.3929 | 224.1206 | 1.6780 | 3.5696 | 0.6162 | 55.6000 | 725.1064 | 0.0000 | 68.5315 | 444.4444 | 2.5685 | 0 | 0.1001 | 6.1799 | 1.5859 | 5.9353 | 1.5106 | 7.116 | 1.2108 | 409.186 | 76.022 | 17.03 | 53.1570 | 263.276 | 5.13 | 19.9909 | 537.8145 | 8.4300 | 2.1574 | 82.0989 |
| 17 | 3032.73 | 2517.79 | 2270.2556 | 1258.4558 | 1.3950 | 104.8078 | 195.3425 | 10.0002 | 420.4726 | 9.4147 | 185.9278 | -5612.50 | -3617.50 | 514.75 | 3.3272 | 60.2444 | 2.2667 | 3.6651 | 85.0465 | 8.5762 | 87.4811 | 74.04 | 4.768 | 361.4573 | 147.8655 | 140.1191 | 255.6310 | 2845 | -5.5327 | 122.3327 | 9.6391 | 25.5327 | 0.9808 | 153.6109 | 110.5151 | 8.0131 | 1777.3281 | 9100.6200 | 0.6241 | 769.1809 | 58.6493 | 3.263 | -0.0473 | 1004.8054 | 39.4139 | 99 | 188.2 | 158.7 | 48.0000 | 180.0040 | 4.94 | 5.949 | 15.490 | 0.49 | 1653 | 346 | 1216 | 1895 | 2.5 | 1.0 | 19.18 | 7.42 | 20.421 | 3.21 | 53.933 | 0.270 | 11.45 | 4.90 | 68.416 | 2.9399 | 120.1371 | 1253.3994 | 123.2961 | 32.201 | 2.2894 | 2.5691 | 6.5844 | 288.6739 | 0.0000 | 59.4889 | 1.3003 | 5.1695 | 33.6144 | 368.1399 | 44.1176 | 53.6558 | 1.6919 | 2.1334 | 21.0980 | 88.5639 | 118.7873 | 10.2185 | 692.0635 | 464.8649 | 66.6667 | 368.6410 | 66.9031 | 0.0000 | 329.5533 | 550.4112 | 1.1747 | 1.7598 | 0.1540 | 28.2881 | 813.8728 | 927.4576 | 89.7338 | 0.0000 | 2.2532 | 0 | 0.0672 | 6.6861 | 1.2866 | 5.7970 | 1.5106 | 7.116 | NaN | NaN | NaN | NaN | 15.8112 | 263.682 | 2.11 | 18.6964 | 531.8418 | 10.2600 | 2.0979 | 82.0989 |
df1.isnull().sum().nunique()
3
df1.isnull().any().any()
True
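Since `df1` contains missing values, they must be imputed before the model can score the rows. A toy sketch of the zero-fill strategy applied in the next cell (the frame here is hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical frame with the same kind of gaps seen in df1
toy = pd.DataFrame({'a': [1.0, np.nan, 3.0], 'b': [np.nan, 5.0, 6.0]})

# fillna(0) is equivalent to replace(np.nan, 0) for this purpose
filled = toy.fillna(0)
assert not filled.isnull().any().any()
```

Zero-fill is a simple choice; whether it matches how missing values were treated when the model was trained should be verified, since a mismatch would skew the predicted probabilities.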
df7 = df1.replace(np.nan, 0)  # impute missing values with 0 (np.NAN was removed in NumPy 2.0)
val2=df7.copy()
val2=val2.reset_index(drop=True)
# Scoring the new entities with the tuned Random Forest and a custom threshold
# store the predicted probability of the fail class (label 1)
y_pred_prob = rf_grid1.predict_proba(val2)[:, 1]
# predict fail when the probability exceeds the tuned threshold of 0.1688
pred = binarize([y_pred_prob], threshold=0.1688)[0]
val2['Pass/Fail'] = pred
val2 = val2[(val2['Pass/Fail'] == 1)]
val2.head(5)
[output truncated: 5 rows × 130 columns — the entities predicted to fail (Pass/Fail = 1.0) at the 0.1688 threshold; the NaN entries seen earlier now appear as 0 after the replacement above]
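The thresholding step above can also be written without `sklearn.preprocessing.binarize`, as a plain comparison on the probability array; like `binarize`, the comparison is strict (values must exceed the threshold to map to 1). A minimal sketch with made-up probabilities:

```python
import numpy as np

# Hypothetical predicted fail probabilities for five entities
y_pred_prob = np.array([0.05, 0.20, 0.1688, 0.90, 0.12])

# Predict fail (1) only when the probability strictly exceeds 0.1688
pred = (y_pred_prob > 0.1688).astype(int)
assert pred.tolist() == [0, 1, 0, 1, 0]
```

Lowering the threshold below the default 0.5 trades precision for recall: more entities are flagged as fails, which suits a yield-excursion setting where missing a true fail is costlier than a false alarm.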